Settlement reached in fatal Kentucky police shooting of Breonna Taylor: Sources

By BILL HUTCHINSON, SABINA GHEBREMEDHIN and STEPHANIE WASH, ABC News

(LOUISVILLE, Ky.) — The city of Louisville, Kentucky, has reached a multimillion-dollar settlement with the family of Breonna Taylor, the emergency medical technician shot to death by police in her own home, sources told ABC News.

The settlement is expected to be announced by city officials and Taylor's family on Tuesday afternoon and includes a police reform package, sources said.

Civil rights attorney Benjamin Crump, who is representing Taylor's family, has called a news conference for 2 p.m. ET at the Louisville mayor's office to announce the "significant update in the Breonna Taylor case." Crump said in a statement that members of Taylor's family will attend the news conference.

Taylor and her boyfriend, Kenneth Walker, were sleeping inside their Louisville apartment on March 13 when officers with the Louisville Metro Police Department attempted to execute a "no-knock" search warrant. Three plainclothes officers opened Taylor's front door and "blindly" opened fire into their apartment, according to a wrongful death lawsuit filed in April by Taylor's mother, Tamika Palmer.

Taylor, a licensed EMT, was shot at least eight times and died, according to the lawsuit filed by Palmer.

Taylor was accused of accepting USPS packages for an ex-boyfriend whom police were investigating as an alleged drug trafficker, according to the warrant.

Police said they knocked several times before using a ram to open the door and were allegedly met with gunfire. Walker said he called 911 before firing one shot from his licensed firearm, hitting one of the officers in the leg.

The three officers involved in the shooting, Sgt. Jonathan Mattingly and detectives Brett Hankison and Myles Cosgrove, were placed on administrative reassignment pending the results of an investigation. Hankison was later fired for his role in the incident.

According to his termination letter, which was shared with local reporters, Hankison violated procedure when he fired 10 rounds into Taylor's apartment while executing the warrant.

"I have determined you violated Standard Operating Procedure … when your actions displayed an extreme indifference to the value of human life when you wantonly and blindly fired ten rounds into the apartment of Breonna Taylor," the letter stated.

No charges have been filed against the officers. Daniel Cameron, Kentucky's attorney general, released a statement this week saying the investigation was still ongoing.

Copyright © 2020, ABC Audio. All rights reserved.

Vermont Yankee automatically shuts down

The Vermont Yankee nuclear power station automatically shut down yesterday at approximately 3:25 pm. The plant was at 70 percent of its normal output after restarting from its refueling and maintenance outage. Plant systems responded safely as designed. Plant technicians are investigating the cause of the shutdown. Initial indications are that the shutdown was caused by a problem in the 345 kV switchyard located outside the plant. There has been no release of radiation. The plant will be restarted after the problem has been identified and repairs have been completed.

Source: Vermont Yankee. 5.26.2010

MLB wrap: Dallas Keuchel pegged with loss in Braves debut

It certainly wasn't the start Dallas Keuchel or the Braves had in mind when the Cy Young Award winner took the mound Friday. But the loss, a 4-3 defeat to the Nationals, should not be cause for concern for either party. Keuchel was pitching in an MLB game for the first time since last year's American League Championship Series. His debut lasted five innings as he gave up eight hits and three earned runs while striking out three.

Once Keuchel shakes off the rust, he will hopefully live up to the expectations Atlanta had when it signed the lefty to help cement its place in the National League East.

The Braves, who built a small cushion at the top of the division, still hold a 4 1/2 game advantage over the second-place Phillies. But it's the Nationals who are hitting their stride at the right time. Washington, which sits in third place, has won five consecutive games and nine of 12. A few more wins and the Nationals could catch up to the Phillies and put pressure on the Braves.

Their win Friday was highlighted by a home run off the bat of Yan Gomes, just his third of the season. Anthony Rendon had the go-ahead RBI in the fifth inning to reach the game-winning total. Stephen Strasburg was awarded the win after six innings of work in which he allowed five hits and three earned runs while striking out five.

Studs of the Night
Freddie Freeman has been on fire. The Braves first baseman joined Andruw Jones as the only players in franchise history with at least one RBI in nine consecutive games since the team moved to Atlanta in 1966.

Joe Musgrove gave up just five hits in seven one-run innings in the Pirates' 2-1 win over the Padres. He moved to 5-7 on the season as Pittsburgh picked up its fourth win in six games.

It was a heck of a night for Jeff McNeil. The Mets' leadoff man made his first career start in right field, and it wasn't a problem for the "Squirrel." He hit a home run and drove in three runs as New York squeaked by the Cubs, 5-4.

Walker Buehler had a career game as the Dodgers faced the Rockies. The righty had a career-high 16 strikeouts in nine innings of work. It was the most strikeouts by a Los Angeles pitcher under 25 since Ramon Martinez had 18 in a game in 1990. It was also the most innings he has pitched all season.

Duds of the Night
The Phillies could have used some help from the middle of their lineup, but Jay Bruce went 0 for 5 at the plate and J.T. Realmuto followed that up with an 0-for-4 night — his fourth straight hitless game. Philadelphia went on to lose 2-1 to Miami.

Highlights
Gary Sanchez hit a dinger. His 481-foot home run is the fourth longest for a Yankees player in the Statcast era.

"481 feet. Oh my, Gary. 😱 #Crushed pic.twitter.com/Cc1uE97NrB" — MLB (@MLB) June 22, 2019

There must have been something in the air Friday, because these homers were going the distance. Rangers outfielder Nomar Mazara tied the record for the longest home run in Statcast history.

"5️⃣0️⃣5️⃣ FEET! 😱😱😱 pic.twitter.com/C40SrLUXIl" — MLB (@MLB) June 22, 2019

What's Next?
Blue Jays (27-49) at Red Sox (42-35), 4:05 p.m. ET — Boston is hoping this is the start of another winning streak. The Red Sox went the distance with the Blue Jays on Friday and won on a walkoff home run in extra innings. It was the team's second straight win. Boston will send Brian Johnson (1-1, 10.00 ERA) to the mound as it tries to keep up with second-place Tampa Bay in the American League East.

Diddy’s son defends acceptance of UCLA football scholarship

(NNPA) — Justin Combs, son of hip-hop mogul Sean "Diddy" Combs, took to Twitter last week to defend his full scholarship to UCLA, which the school confirmed this week. Many in cyberspace are questioning whether the multimillionaire's son should have accepted the $54,000 football scholarship, given the school's economic troubles.

But the 18-year-old said he earned the scholarship. "Regardless what the circumstances are, I put that work in!!!! PERIOD," he tweeted on May 30.

"Regardless of what you do in life every1 is gonna have their own opinion," he tweeted. "Stay focused, keep that tunnel vision & never 4get why u started."

Combs, a 5-foot-9, 170-pound defensive back, reportedly graduated from Iona Prep in New Rochelle, New York, with a 3.75 GPA, according to the Los Angeles Times.

UCLA defended its decision, saying Combs' award was not siphoned from need-based scholarships to other students.

Athletic scholarships are "entirely funded by Athletic Department ticket sales, corporate partnerships, media contracts and private donations" and "do not rely on state funds," university spokesman Ricardo Vazquez told the Times.

"There is a big separation between financial aid based on need and how that's funded and how athletic scholarships are funded and awarded to students," he added.

(Reprinted from the Afro American)

ThurstonTalk Wins “New Business of the Year”

Owners Dan Jones and Martin McElliott

Each year, the Thurston County Economic Development Council (EDC) acknowledges the significant role that private business and non-profit organizations play in creating a healthy and diverse economy throughout Thurston County.

In March, community leaders nominated ThurstonTalk as the New Business of the Year. During today's annual meeting, the EDC announced ThurstonTalk as the winner!

"It's an honor to be recognized by the leaders of Thurston County. I give a tremendous amount of credit for our success to my business partner, Martin McElliott. He has done an outstanding job embracing the vision of ThurstonTalk while executing our strategic business plan," states ThurstonTalk founder, Dan Jones.

Over the past 18 months, ThurstonTalk has grown to 24 team members, attracting hundreds of local business customers. ThurstonTalk has rapidly emerged as a community asset, reaching over 150,000 local views each month, and is the leader in promoting positive information about people, businesses, and organizations doing good things in Thurston County.

"This is an award recognizing our entire team. I thank them for their dedication and professionalism," continues Jones.

About ThurstonTalk
Tim Shaw, Brent Bryant, Amy Rowley, Martin McElliott

ThurstonTalk.com is an information source serving the Thurston County community—from Olympia, Lacey and Tumwater to Tenino, Yelm, Rainier and beyond. ThurstonTalk.com officially launched on January 1, 2011, and has grown to become a dominant voice for local businesses, events, news and sports.

After five years of development and leveraging 11 years' worth of efforts, ThurstonTalk.com was created on a strong business model. A vibrant community needs an information source that has the ability to interact with community members through multiple tools, while adding a meaningful advertising platform for local businesses.

Thurston County Offers Neighborhood Preparedness Workshop

Submitted by Thurston County

When disaster strikes, the first to respond to you and your family are often not emergency workers, but your neighbors. Get the tips, tricks and tools you need to get you and your neighbors ready to stay safe and survive the next big disaster at the Map Your Neighborhood workshop on January 15.

The award-winning Map Your Neighborhood program is designed to improve disaster readiness at the neighborhood level by teaching neighbors to rely on each other during the hours or days before fire, medical, police, or utility responders may be able to reach them after a disaster. The Map Your Neighborhood Train-the-Trainer workshop on January 15 will not only offer tips and training on how to prepare for and respond to a disaster, it will also offer community organizing instruction and materials to help you get your neighbors engaged and create your own disaster response team for your community.

For more information about the January 15 workshop, or for more information on the Thurston County Emergency Management Division, visit www.co.thurston.wa.us/em or contact Vivian Eason at [email protected] or (360) 867-2825.

WHAT: FREE! Map Your Neighborhood: Train-the-Trainer Workshop
WHEN: Tuesday, January 15, 6:30 – 8 p.m.
WHERE: Thurston County Emergency Coordination Center at 9521 Tilley Road SW, 98512, just south of the Tumwater city limits.

Taapsee Pannu is a miracle in pink, and this is not about the movie

If you're having a bad day, you need to look at this spell-casting combination of Taapsee Pannu and pink. No, we're not talking about the movie Pink here, although it is no less a wonder.

The beautiful Naam Shabana actor donned a lustrous cotton dress, which she styled with a chic white shirt, and redefined the term fusion dressing for us. Taapsee looked absolutely gorgeous in the hot pink, one-shoulder dress, and we've learnt a thing or two about mix-matching clothes like a genius!

Picture courtesy: Instagram/urvashisoneja

The cotton attire was absolutely bewitching owing to that magnificent shimmer, while the white, circular pattern added an edge to the Urvashi Soneja number.

Also Read: Ranveer Singh is back in black, so let's just breathe now

The sensuality of the hot pink dress and the class that came with the white shirt blend so well that we'd never be able to look at a white shirt without wanting to style it like Taapsee. The flared sleeves of the dress contrasted enigmatically with the buttoned sleeves of the shirt, creating a fusion of drama and poise in the ensemble. And that's how you get the best of both worlds.

Picture courtesy: Instagram/urvashisoneja

Taapsee, who was every bit a stunner in this enthralling attire, kept her hair tied in a low bun, keeping it classy and chic.

Also Read: Let Twinkle Khanna teach you how to dress sexy for the summer

The lovely actor teamed up her super-stylish outfit with a pair of open-toe black stilettos, and sported minimal makeup. Taapsee totally rocked this fusion look, and it's time we give her an award already.

FIFA U-17 World Cup: Fans should be very proud of what India has done, says England coach

England coach Steve Cooper on Friday lauded hosts India's performance in their maiden FIFA U-17 World Cup, stating that the fans can be mighty pleased with the way their team played.

India, playing in a World Cup at any level for the first time, lost all three of their matches, crashing out of Group A, but gave a good account of themselves, especially against Colombia. (India's future in safe hands as Dheeraj Singh shines bright at FIFA U-17 World Cup)

Luis Norton de Matos' colts lost 1-2 but levelled in spectacular fashion through Jeakson Singh in the second half and came close to eking out a point in the final stages of the game. Against the USA, they lost 0-3 but were impressive, while a much superior Ghana thrashed them.

"[email protected] team lost d match but won hearts for their sheer spirit & determination. Well played! We are proud of u. #FIFAU17WC #SAI pic.twitter.com/y1DcJ11fgP" - SAIMedia (@Media_SAI) October 13, 2017

"The U17 @IndianFootball team inspired millions of aspiring young footballers across nation wid it's indomitable spirit. #FIFAU17WC #SAI pic.twitter.com/SyF9IAAlek" - SAIMedia (@Media_SAI) October 13, 2017

"I did and we made a real point with the players of watching them, certainly the opening game for two reasons. One we respect the home nation. What a wonderful experience the boys had in representing the country in a home tournament like this," coach Cooper told reporters on the eve of their game against Iraq in Group F.

England have already booked their place in the knockouts riding two wins in two matches.

Cooper said coach Matos created a good identity for the team and the supporters can be very proud of the way the hosts approached all three matches.

"The results haven't gone India's way but they must be very proud of the performances. Congratulations to the coach, he has created a real good identity. He was really clear as to how he wants the team to play. I personally think the fans should be very proud of what India has done."

Cooper said he spent some time with India counterpart Stephen Constantine during their pre-tournament camp in Mumbai and learnt more about India's football.

"I have spent some time with Stephen Constantine in the build-up in Mumbai. He spoke very well of how they develop footballers here."

"Memorable moments of great sportsman spirit at the #FIFAU17WC, conveys the essence of true sportsman. #IndiavGhana @IndianFootball #SAI pic.twitter.com/TW9OoLAIyA" - SAIMedia (@Media_SAI) October 13, 2017

A history of microprocessor debug, 1980–2016

Since the dawn of electronics design, where there have been designs, there have been bugs. But where there have been bugs, there inevitably was debug, engaged in an epic wrestling match with faults, bugs, and errors to determine which would prevail — and how thoroughly.

In many ways, the evolution of debug technology is as fascinating as any aspect of design, but it rarely receives the spotlight. Debug has evolved from simple stimulus-response-observe approaches to sophisticated tools, equipment, and methodologies conceived to address increasingly complex designs. Now, in 2017, we sit at the dawn of a new and exciting era with the introduction of debug over functional I/O.

This is the culmination of decades of hard work and invention from around the globe. I've been involved in debug since 1984, so to truly appreciate the paradigm shift we're now experiencing in debug, it's useful to take a look back at the innovation that has taken place over the years.

1970s-1980s
System design was very different in this period compared to the way things are today. A typical system would consist of a CPU, (EP)ROM, RAM, and some peripherals (PIC, UART, DMA, TIMERs, IO…), each implemented in its own IC.

1980s single-board computer (SBC)
(Source: http://oldcomputers.net/ampro-little-board.html)

The typical development flow was to write your code in ASM or C and get it compiled, linked, and located so that you ended up with a HEX file for the ROM image.
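In this era the HEX file was typically something like the Intel HEX format (Motorola S-records were the other common choice): plain ASCII records carrying a byte count, a load address, a record type, the data, and a checksum, which the EPROM programmer parsed line by line. Purely as a hypothetical illustration of what the locate step produces, the C sketch below emits one data record and the end-of-file record; the bytes and addresses are invented.

```c
#include <stdint.h>
#include <stdio.h>

/*
 * Emit one Intel HEX data record (type 00) for up to 255 bytes of ROM image.
 * Record layout: ':' LL AAAA TT DD..DD CC, where CC is the two's complement
 * of the sum of every byte between the colon and the checksum itself.
 */
static void emit_hex_record(FILE *out, uint16_t addr, const uint8_t *data, uint8_t len)
{
    uint8_t sum = (uint8_t)(len + (addr >> 8) + (addr & 0xFF)); /* type 00 adds nothing */

    fprintf(out, ":%02X%04X00", (unsigned)len, (unsigned)addr);
    for (uint8_t i = 0; i < len; i++) {
        fprintf(out, "%02X", (unsigned)data[i]);
        sum = (uint8_t)(sum + data[i]);
    }
    fprintf(out, "%02X\n", (unsigned)(uint8_t)(0x100 - sum));   /* checksum byte */
}

int main(void)
{
    const uint8_t rom_chunk[4] = { 0x3E, 0x01, 0xD3, 0x40 };    /* arbitrary example bytes */

    emit_hex_record(stdout, 0x0100, rom_chunk, sizeof rom_chunk);
    fputs(":00000001FF\n", stdout);                             /* end-of-file record */
    return 0;
}
```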
You would then take the old EPROM(s) out of the sockets on the target board, place them in a UV EPROM eraser, and blast them with UV light for 20 minutes.

EPROM Eraser
(Source: https://lightweightmiata.com/arcade/area51/area5114.jpg)

You then placed the EPROM(s) into an EPROM programmer and downloaded the HEX file from your computer (typically via a serial or parallel interface) to program them up.

EPROM Programmer
(Source: http://www.dataman.com/media/catalog/product/cache/1/image/9df78eab33525d08d6e5fb8d27136e95/m/e/mempro.jpg)

Finally, you plugged the EPROM(s) back into the target board and powered it up to see if your program worked. If your program didn't function as expected, then you had several options available for debugging your code, as follows:

Code Inspection: In this case, you would walk through your code, staring long and hard at it looking for errors. This technique is still used today by those who view the use of any debugging tool as a failure of programming skill! The other reason you would do this is if the following techniques were either not available to you due to hardware restrictions or because of the cost.

LEDs: This technique is also still in use today. If you happen to have LEDs, or any other indicator on the target system, you can determine the path through your code by modifying the code to signal a state at significant places in the code. You can then just look at the LEDs to see the progress (or often lack of progress) through your code, thus helping you to determine where to focus your attention. (See also When life fails to provide a debugging interface, blink a RGB LED.) If you had several spare digital IOs and were lucky enough to have access to a logic analyser, you could effectively trace your path through the code in real time by tracing the states (locations) output by your program.

On-target monitor: For those target boards that had a serial port (RS232) and enough free EPROM/RAM to include a monitor program, you could step through your code at the assembly level and display the contents of registers and memory locations. The monitor program was effectively a low-level debugger that you included in your own code. At some place in your program, you would jump into the monitor program and start debugging. The serial port was used to interact with the monitor program, and the user would issue commands such as "s" to step an instruction and "m 83C4,16" to display the contents of 16 locations in memory starting at address 0x83C4, for example. Once the code was working as expected, the final program would usually be built without the monitor in place. (A minimal sketch of such a monitor's command loop appears at the end of this section.)

In-Circuit Emulator: For those who could afford it, the In-Circuit Emulator (ICE) was the ultimate debug tool. In some ways, this tool provided more functionality than the state-of-the-art debug tools provide developers today! The ICE would replace the CPU in the target system with electronics that emulated the CPU. These ICE tools were large (far larger than a desktop PC) and very expensive — we are talking many thousands of dollars. In this era, the ICE was typically designed by the CPU manufacturer or one of the major tool companies of the time (Tektronix, HP/Agilent, Microtek, etc.) and would contain a "bond-out" version of the CPU under emulation. The bond-out CPU literally had extra internal signals brought out to pins on the device so that the emulator could both control the CPU and gain extra visibility into its internal operation. The emulator could watch the operations performed by the CPU and would provide complex breakpoints and tracing functionality that would be the envy of many a developer today. It was also possible to replace an area of on-target memory (typically the EPROM) with emulation RAM contained in the ICE. This let you download your code into the emulation RAM — no more erasing and blowing of EPROMs during development — bliss!

Motorola Exorciser ICE
(Source: http://www.exorciser.net/personal/exorciser/Original%20Files/exorciser.jpg)

Intel MDS ICE
(Source: http://www.computinghistory.org.uk/userdata/images/large/PRODPIC-731.jpg)
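Here is a minimal, hypothetical sketch in C of the kind of command loop such a monitor implemented. It is not taken from any vendor's ROM monitor: the uart_* routines are host-side stand-ins for the board's serial I/O, and only the memory-dump and "go" commands are shown.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Stand-ins for the board's serial routines. On a real target these would
 * poll the UART data/status registers directly; the names are illustrative. */
static void uart_put_char(char c) { putchar(c); }
static void uart_get_line(char *buf, int n)
{
    if (!fgets(buf, n, stdin)) {          /* EOF: behave like "go" */
        buf[0] = 'g';
        buf[1] = '\0';
    }
}

static void put_string(const char *s)
{
    while (*s)
        uart_put_char(*s++);
}

/* Dump 'count' bytes starting at 'addr', e.g. in response to "m 83C4,16". */
static void dump_memory(uintptr_t addr, unsigned count)
{
    const volatile uint8_t *p = (const volatile uint8_t *)addr;
    char field[32];

    for (unsigned i = 0; i < count; i++) {
        if ((i % 8) == 0) {               /* new row every 8 bytes */
            sprintf(field, "\r\n%p: ", (void *)(p + i));
            put_string(field);
        }
        sprintf(field, "%02X ", p[i]);
        put_string(field);
    }
    put_string("\r\n");
}

/* The monitor's command loop: the application jumped here to start debugging. */
static void monitor_entry(void)
{
    char cmd[32];

    for (;;) {
        put_string("> ");
        uart_get_line(cmd, sizeof cmd);

        if (cmd[0] == 'm') {              /* "m <hex addr>,<count>" */
            uintptr_t addr  = (uintptr_t)strtoul(&cmd[1], NULL, 16);
            const char *sep = strchr(cmd, ',');
            unsigned count  = sep ? (unsigned)strtoul(sep + 1, NULL, 10) : 16;
            dump_memory(addr, count);
        } else if (cmd[0] == 'g') {       /* "g": resume the application */
            return;
        } else {
            put_string("?\r\n");
            /* A real monitor also handled "s" (single step, usually by
             * planting a software-interrupt instruction), register dumps,
             * memory writes and breakpoints. */
        }
    }
}

int main(void)
{
    uint8_t demo[32];                     /* something safe to dump on a host */
    for (unsigned i = 0; i < sizeof demo; i++)
        demo[i] = (uint8_t)i;

    printf("try: m %p,16  (then g to quit)\n", (void *)demo);
    monitor_entry();
    return 0;
}
```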
1982-1990
During the 1980s, three main changes evolved for the embedded developer. The first was that more integrated ICs started to appear that contained combinations of CPU, PIC, UART, DMA — all included within the one device. Examples would be the Intel 80186/80188, which was an evolution of the 8086/8088 CPUs (original IBM PC), the Zilog Z180, which was an evolution of the Z80 (Sinclair Spectrum), and the Motorola CPU32 family (e.g., the 68302), which was an evolution of the 68000 (Apple Lisa).

The second was that the ICE became much more accessible to developers. Several companies had started manufacturing ICE tools at much lower cost than the CPU manufacturers' systems. Many of these companies did not use bond-out chips. Whilst this led to a small decrease in available functionality, it significantly contributed to the increased availability of lower-cost ICE products. An ICE for an 80186 could now be picked up for less than $10,000.

The third was that the ever-increasing CPU clock speeds started to cause problems for ICE technology. This placed significant challenges on the cabling systems that ICEs used, and started to cause problems with the emulation control technology, which just could not operate at these high speeds without becoming seriously expensive (again). CPU manufacturers were also becoming more reluctant to create bond-out versions of the CPUs since the extra on-chip connections interfered with chip operation. The solution to these problems was to build the CPU debug control circuitry on-chip. This allowed single step, memory and register access, and breakpoint technology to operate at full CPU speed, but did not at this time provide for trace, which still needed access to the device's external bus interface pins.

This trace was also less functional since, for many internal peripheral accesses, the external bus was not used. Hence, only external accesses were fully visible and the internal peripheral accesses were dark. Access to the on-chip debug (OCD) technology was either via a proprietary interface technology — typically referred to as BDM (Background Debug Mode) — or via the standard JTAG interface, which was more traditionally used for production test rather than debug. These interfaces allowed companies to create low-cost debug tools for control of CPU execution with no clock speed limitations. Features varied slightly between implementations; for example, some allowed the debug tool to access memory while the CPU was executing, whilst others did not.

1990-2000
External trace pretty much died out. The increase in CPU clock speeds, coupled with the introduction of internal CPU cache, made external trace pretty much useless. However, to diagnose more complex program defects, there was still a requirement to be able to record the execution path of the CPU. The challenge was how to do this using on-chip logic (so it could operate at full CPU speed) while transporting the trace data off chip at a feasible clock rate using as few pins as possible. The solution was to transform the execution path of the CPU into a compressed data set, which could be transported off-chip and captured by a debug tool. The tool can then use the data set to reconstruct the execution path. It was realized that if the debug tool had access to the executed program, the compression could be lossy. For example, if only the non-sequential program counter changes were output, the debug tool could "fill in the gaps" using knowledge of the program being executed. IBM's PowerPC, Motorola's ColdFire CPUs, ARM's ARM7TDMI-based cores, and others all implemented trace systems based on this concept.
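To make the lossy-compression idea concrete, the toy C program below reconstructs an execution path from nothing more than a known start address, the tool's copy of the program image, and one "taken/not-taken" atom per executed branch. The image, addresses, and atom stream are invented; real protocols such as ARM's ETM also encode indirect branch targets, exceptions, and timestamps, but the reconstruction principle is the same.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* The debug tool's copy of the program image. It built this code, which is
 * exactly what allows the on-chip trace stream to be lossy. The addresses,
 * the tiny loop, and the fixed 4-byte instruction size are all invented. */
typedef struct {
    uint32_t addr;        /* instruction address                  */
    bool     is_branch;   /* does this instruction branch?        */
    uint32_t target;      /* destination if the branch is taken   */
} insn_t;

static const insn_t image[] = {
    { 0x8000, false, 0      },   /* set up a loop counter          */
    { 0x8004, false, 0      },   /* loop body: do some work        */
    { 0x8008, true,  0x8004 },   /* conditional branch back        */
    { 0x800C, false, 0      },   /* fall-through: loop finished    */
};

static const insn_t *lookup(uint32_t addr)
{
    for (size_t i = 0; i < sizeof image / sizeof image[0]; i++)
        if (image[i].addr == addr)
            return &image[i];
    return NULL;                 /* walked outside the known image */
}

int main(void)
{
    /* The compressed trace captured from the target: one atom per executed
     * branch and nothing else. 1 = taken, 0 = not taken, so the loop body
     * runs three times before execution falls through to 0x800C. */
    const int atoms[]  = { 1, 1, 0 };
    size_t   next_atom = 0;

    uint32_t pc = 0x8000;        /* known reset/start address */
    for (;;) {
        const insn_t *in = lookup(pc);
        if (!in)
            break;
        printf("executed 0x%04X\n", (unsigned)pc);

        if (in->is_branch) {
            if (next_atom >= sizeof atoms / sizeof atoms[0])
                break;           /* end of the captured trace     */
            pc = atoms[next_atom++] ? in->target : in->addr + 4;
        } else {
            pc = in->addr + 4;   /* sequential: no trace needed   */
        }
    }
    return 0;
}
```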
2000-2010
With the introduction of compressed core trace datasets, it became feasible to choose between transporting the dataset off chip and/or using a relatively small on-chip trace buffer to hold the data. In the early 2000s, various vendors strived to improve trace performance; ARM, for example, architected the Embedded Trace Buffer (ETB), which was accessible via JTAG and configurable in size to hold the trace data. This solved the issue of having to provide a relatively high-speed off-chip trace port (though still nowhere near core clock speed) at the expense of using silicon area in the SoC.

In the mid-2000s, embedded CPU designers started to implement multi-core systems. The designs using ARM IP made use of JTAG technology, with each core appearing in the serial JTAG scan chain. This was not a problem until core power management was implemented, which resulted in cores losing their presence on the JTAG scan chain when powered down. JTAG does not support devices appearing and disappearing from the serial scan chain, so this caused complications for both debug tooling and SoC designers. To overcome this, ARM created a new debug architecture called CoreSight. This allowed a single JTAG-based debug access port (one device on the JTAG scan chain) to provide access to many memory-mapped CoreSight components, including all of the ARM cores in the system. Now, CoreSight-compliant devices were free to power down without affecting the JTAG scan chain (you can read more about CoreSight technology in this new whitepaper). This technology is still in use in more modern — and much more complicated — ARM IP-based systems that are designed today.

2010-present
As embedded processors increased in capability — especially with the advent of 64-bit cores — it became more feasible to support on-device debug. Previously, the typical debug system used debug tooling on a high-powered workstation utilizing a JTAG/BDM connection to the target system to control execution/trace. As Linux/Android gained widespread use, the kernel was augmented with device drivers to access the on-chip CoreSight components. By utilizing the perf subsystem, on-target trace capture and analysis is now possible.

With the introduction of the ARM Embedded Logic Analyser (ELA), it is now possible to return to the days of the ICE and have access to complex on-chip breakpoints, triggers, and trace with access to internal SoC signals — just like the old bond-out chips used to provide in the early 1980s.

Today, after 40 years of innovation, we're on the cusp of a new era in debug, one in which engineers can perform debug and trace over functional I/O, thereby saving both time and money.
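As a sketch of what "memory-mapped CoreSight components" means from software's point of view, the code below performs the kind of identification check a debug tool or kernel driver makes when it walks a system's debug address space: each component occupies a 4 KB block whose last four words are component ID registers. The base address here is purely a placeholder, and the offsets and signature value should be confirmed against the CoreSight architecture specification and the SoC's reference manual rather than taken from this sketch.

```c
#include <stdint.h>
#include <stdio.h>

/* Component ID register offsets within a 4 KB CoreSight component block.
 * These offsets and the 0xB105x00D signature follow the CoreSight
 * identification scheme; the base address below is purely a placeholder
 * for some SoC's debug APB map. Target-side code only. */
#define CS_CIDR0  0xFF0u
#define CS_CIDR1  0xFF4u
#define CS_CIDR2  0xFF8u
#define CS_CIDR3  0xFFCu

static uint32_t read32(uintptr_t addr)
{
    /* On target this is a plain register read; a host tool would issue the
     * equivalent access through the JTAG/SWD debug access port instead. */
    return *(volatile uint32_t *)addr;
}

/* Returns the component class (0x9 indicates a CoreSight component) or -1
 * if the identification signature does not match. */
static int coresight_component_class(uintptr_t base)
{
    uint32_t cid = ((read32(base + CS_CIDR3) & 0xFF) << 24) |
                   ((read32(base + CS_CIDR2) & 0xFF) << 16) |
                   ((read32(base + CS_CIDR1) & 0xFF) << 8)  |
                    (read32(base + CS_CIDR0) & 0xFF);

    if ((cid & 0xFFFF0FFFu) != 0xB105000Du)   /* preamble bytes 0x0D, 0x05, 0xB1 */
        return -1;
    return (int)((cid >> 12) & 0xF);          /* class field from CIDR1[7:4]     */
}

int main(void)
{
    uintptr_t base = 0x80010000u;             /* placeholder component address   */

    int cls = coresight_component_class(base);
    if (cls == 0x9)
        printf("CoreSight component found at %p\n", (void *)base);
    else
        printf("no CoreSight component at %p (class %d)\n", (void *)base, cls);
    return 0;
}
```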
The push for performing debug over existing device interfaces will not only provide a leaner solution, but will also help step up debug and trace capability to the next level. Thus begins a new chapter in our fascinating and long history in the war against bugs.