Thursday, 15 July 2021

The Worst CPUs Ever Made

After we covered the worst storage media ever, it’s now time to revisit some of the worst CPUs ever built. To make it onto this esteemed list, a CPU needed to be fundamentally broken, as opposed to merely poorly positioned or slower than expected. The annals of history are already stuffed with mediocre products that didn’t quite meet expectations but weren’t truly bad.

Note: Plenty of people will bring up the Pentium FDIV bug here, but the reason we didn’t include it is simple: despite being an enormous marketing failure for Intel and a huge expense, the actual bug was tiny. It affected almost no one outside scientific computing, and the technical scale of the problem was never estimated to amount to much. The incident is recalled today more for the disastrous way Intel handled it than for any overarching problem in the Pentium microarchitecture.
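
For the curious, the canonical demonstration of the bug fits in a few lines of C. Mathematically, the residual below is exactly zero; on a flawed Pentium, this particular dividend/divisor pair famously returned 256. A minimal sketch:

```c
#include <stdio.h>

int main(void) {
    /* Classic FDIV check: mathematically, x - (x / y) * y == 0. */
    double x = 4195835.0;
    double y = 3145727.0;
    double residual = x - (x / y) * y;

    /* Prints 0.000000 on a correct FPU; a flawed Pentium returned 256. */
    printf("residual: %f\n", residual);
    return 0;
}
```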

We also include a few dishonorable mentions. These chips may not be the worst of the worst, but they ran into serious problems or failed to address key market segments.

Intel Itanium

Intel’s Itanium was a radical attempt to push hardware complexity into software optimization. All of the work of determining which instructions to execute in parallel was handled by the compiler before the CPU ran a byte of code. Analysts predicted Itanium would conquer the world. It didn’t. Compilers couldn’t extract the necessary performance, and the chip was radically incompatible with everything that had come before it. Once expected to replace x86 entirely and change the world, Itanium limped along for years with a niche market and precious little else.
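
To see why “the compiler does the scheduling” was such a gamble, consider a toy illustration in C (not IA-64 code). The first pair of statements is independent, so an EPIC compiler can bundle them into one issue group; the dependent chain that follows can’t be parallelized statically no matter how clever the compiler is, and real-world branchy, pointer-chasing code looks mostly like the latter:

```c
#include <stdio.h>

int main(void) {
    int b = 2, c = 3, e = 4, f = 5;

    /* Independent statements: an EPIC compiler can place these in a
       single issue group, and the wide hardware runs them in parallel. */
    int a = b + c;
    int d = e * f;

    /* A dependent chain: each line needs the previous result, so the
       compiler must serialize it and the parallel hardware sits idle. */
    int x = a + d;
    int y = x * x;
    int z = y - a;

    printf("%d %d %d %d %d\n", a, d, x, y, z);
    return 0;
}
```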

Itanium’s failure was particularly painful because it represented the death of Intel’s entire 64-bit strategy (at the time). Intel had originally planned to move the entire market to IA-64 rather than extend x86. AMD’s x86-64 (AMD64) proved quite popular, partly because Intel had no luck bringing a competitive Itanium to market. Not many CPUs can claim to have failed so egregiously that they killed their manufacturer’s plans for an entire instruction set.


Intel Pentium 4 (Prescott)

Prescott doubled down on the P4’s already-long pipeline, extending it to 31 stages (from Northwood’s 20), while Intel simultaneously moved the P4 to a 90nm process. This was a mistake. The new chip was crippled by pipeline stalls that even its new branch prediction unit couldn’t prevent, and parasitic leakage drove power consumption high enough to keep the chip from hitting the clocks it needed to be successful. Prescott and its dual-core sibling, Smithfield, are the weakest desktop products Intel ever fielded relative to the competition of the day. Intel set revenue records with the chip, but its reputation took a beating.
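
A back-of-the-envelope calculation shows why the deeper pipeline hurt: a mispredicted branch flushes roughly the whole pipe, so every extra stage adds to the penalty. The branch frequency and misprediction rate below are assumptions chosen for illustration, not measured Prescott figures:

```c
#include <stdio.h>

int main(void) {
    /* Illustrative, assumed workload numbers -- not measured data. */
    double stages_northwood = 20.0;
    double stages_prescott  = 31.0;
    double branch_freq      = 0.20;  /* ~1 in 5 instructions is a branch */
    double mispredict_rate  = 0.05;  /* 5% of branches mispredicted */

    /* Extra stall cycles per instruction from the deeper pipeline: */
    double extra = branch_freq * mispredict_rate *
                   (stages_prescott - stages_northwood);
    printf("extra stall cycles per instruction: %.3f\n", extra);
    return 0;
}
```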

AMD Bulldozer

AMD’s Bulldozer was supposed to steal a march on Intel by cleverly sharing certain chip capabilities to improve efficiency and reduce die size. AMD wanted a smaller core with higher clocks to offset any penalties related to the shared design. What it got was a disaster. Bulldozer couldn’t hit its target clocks, drew too much power, and delivered a fraction of the performance it needed. It’s rare for a CPU to be so bad that it nearly kills the company that invented it. Bulldozer nearly did. AMD did penance for Bulldozer by continuing to use it: despite the core’s flaws, it formed the backbone of AMD’s CPU family from late 2011 through early 2017.

Cyrix 6×86

Cyrix was one of the x86 manufacturers that didn’t survive the late 1990s (VIA now holds its x86 license), and chips like the 6×86 were a major part of the reason why. Cyrix had the dubious distinction of being the reason some games and applications carried compatibility warnings. The 6×86 was significantly faster than Intel’s Pentium in integer code, but its FPU was abysmal and its chips weren’t particularly stable when paired with Socket 7 motherboards. If you were a gamer in the late 1990s, you wanted an Intel CPU but could settle for AMD. The 6×86 was one of the terrible “everybody else” chips you didn’t want in your Christmas stocking.

The 6×86 failed because it couldn’t differentiate itself from Intel or AMD in a way that made sense or gave Cyrix an effective niche of its own. The company tried to develop a unique product and wound up earning itself a second entry on this list instead.

Cyrix MediaGX

The MediaGX was the first attempt to build an integrated SoC processor for the desktop, with graphics, CPU, PCI bus, and memory controller all on one die. Unfortunately, this happened in 1997, which meant all of those components were really terrible. Motherboard compatibility was incredibly limited, the underlying CPU architecture (the Cyrix 5×86) was equivalent to Intel’s 80486, and the CPU couldn’t connect to an off-die L2 cache (the only kind of L2 cache there was, back then). Chips like the Cyrix 6×86 could at least claim to compete with Intel in business applications. The MediaGX couldn’t compete with a dead manatee.

The MediaGX’s Wikipedia entry includes the sentence, “Whether this processor belongs in the fourth or fifth generation of x86 processors can be considered a matter of debate.” The fifth generation of x86 CPUs is the Pentium’s generation, while the fourth refers to 80486 CPUs. The MediaGX shipped in 1997 with a CPU core stuck somewhere between 1989 and 1992, at a time when people really did replace their PCs every 2-3 years if they wanted to stay on the cutting edge. The entry also notes: “The graphics, sound, and PCI bus ran at the same speed as the processor clock also due to tight integration. This made the processor appear much slower than its actual rated speed.” When your 486-class CPU is being choked by its own PCI bus, you know you’ve got a problem.

Texas Instruments TMS9900

The TMS9900 is a noteworthy failure for one enormous reason: when IBM was looking for a chip to power the original IBM PC, it had two basic choices that could hit its ship date — the TMS9900 and the Intel 8086/8088 (the Motorola 68K was under development but wasn’t ready in time). The TMS9900 had only 16 bits of address space, while the 8086 had 20. That’s the difference between addressing just 64KB of RAM and a full 1MB. TI also neglected to develop a 16-bit peripheral chip, which left the CPU stuck with performance-crippling 8-bit peripherals. The TMS9900 also had no on-chip general-purpose registers; its 16 16-bit registers were all stored in main memory. TI had trouble securing second-source partners, and when IBM had to pick, it picked Intel. The rest is history.
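
The address-space gap is simple arithmetic; each additional address bit doubles the reachable memory. A quick check:

```c
#include <stdio.h>

int main(void) {
    /* Addressable memory is 2^(address bits) bytes. */
    unsigned long tms9900 = 1UL << 16;  /* 16 address bits */
    unsigned long i8086   = 1UL << 20;  /* 20 address bits (segmented) */

    printf("TMS9900: %lu bytes (%lu KB)\n", tms9900, tms9900 / 1024);
    printf("8086:    %lu bytes (%lu KB)\n", i8086, i8086 / 1024);
    return 0;
}
```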

Dishonorable Mention: Qualcomm Snapdragon 810

The Snapdragon 810 was Qualcomm’s first attempt to build a big.LITTLE CPU and was based on TSMC’s short-lived 20nm process. The SoC was easily Qualcomm’s least-loved high-end chip in recent memory — Samsung skipped it altogether, and other companies ran into serious problems with the device. Qualcomm claimed the issues were caused by poor OEM power management, but whether the problem was TSMC’s 20nm process, Qualcomm’s implementation, or OEM optimization, the result was the same: a hot-running chip that won precious few top-tier designs and is missed by no one.

Dishonorable Mention: IBM PowerPC G5

Apple’s partnership with IBM on the PowerPC 970 (marketed by Apple as the G5) was supposed to be a turning point for the company. When it announced the first G5 products, Apple promised to ship a 3GHz chip within a year. But IBM failed to deliver silicon that could hit those clocks at reasonable power consumption, and the G5’s power draw made it incapable of replacing the G4 in laptops. Apple was forced to move to Intel and x86 in order to field competitive laptops and improve its desktop performance. The G5 wasn’t a terrible CPU, but IBM wasn’t able to evolve the chip to compete with Intel.

Dishonorable Mention: Pentium III 1.13GHz

The Coppermine Pentium III was a fine architecture. But during the race to 1GHz against AMD, Intel was desperate to maintain a performance lead, even as shipments of its highest-end chips slipped further and further out (at one point, AMD was estimated to hold a 12:1 advantage over Intel in actually shipping 1GHz systems). In a final bid to retake the performance crown, Intel tried to push the 180nm Coppermine P3 up to 1.13GHz. It failed. The chips were fundamentally unstable, and Intel recalled the entire batch.

Dishonorable Mention: Cell Broadband Engine

We’ll take some heat for this one, but we’d toss the Cell Broadband Engine on this pile as well. Cell is an excellent example of how a chip can be phenomenally good in theory yet nearly impossible to leverage in practice. Sony may have used it as the general processor for the PS3, but Cell was far better at multimedia and vector processing than it ever was at general-purpose workloads (its design dates to a time when Sony expected to handle both CPU and GPU workloads with the same processor architecture). It’s quite difficult to multi-thread the chip to take advantage of its SPEs (Synergistic Processing Elements), and it bears little resemblance to any other architecture.
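
To give a flavor of what that meant in practice, here is a rough sketch of the host-side pattern Cell’s libspe2 library required: one PPE thread per SPE context, each loading and running a separately compiled SPE binary. The kernel name here is hypothetical, and the signatures are from memory of the libspe2 documentation, so treat this as an outline rather than gospel:

```c
#include <stdio.h>
#include <pthread.h>
#include <libspe2.h>

/* Separately compiled SPE binary embedded at link time.
   The name "vector_kernel_spu" is hypothetical. */
extern spe_program_handle_t vector_kernel_spu;

/* One host (PPE) thread per SPE context -- the pattern Cell required. */
static void *run_spe(void *arg) {
    spe_context_ptr_t ctx = spe_context_create(0, NULL);
    unsigned int entry = SPE_DEFAULT_ENTRY;

    spe_program_load(ctx, &vector_kernel_spu);
    spe_context_run(ctx, &entry, 0, arg, NULL, NULL);
    spe_context_destroy(ctx);
    return NULL;
}

int main(void) {
    enum { NUM_SPES = 6 };  /* the PS3 exposed six SPEs to software */
    pthread_t threads[NUM_SPES];

    for (int i = 0; i < NUM_SPES; i++)
        pthread_create(&threads[i], NULL, run_spe, NULL);
    for (int i = 0; i < NUM_SPES; i++)
        pthread_join(threads[i], NULL);

    puts("all SPE kernels finished");
    return 0;
}
```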

What’s the Worst CPU Ever?

It’s surprisingly hard to pick an absolute worst CPU. Is it more important that a CPU utterly failed to meet overinflated expectations (Itanium) or that the CPU core nearly killed the company that built it (Bulldozer)? Do we judge Prescott on its heat and performance (bad, in both cases) or on the revenue records Intel smashed with it?

Evaluated in the broadest possible meanings of “worst,” I think one chip ultimately stands feet and ankles below the rest: The Cyrix MediaGX. It is impossible not to admire the forward-thinking ideas behind this CPU. Cyrix was the first company to build what we would now call an SoC, with PCI, audio, video, and RAM controller all on the same chip. More than 10 years before Intel or AMD would ship their own CPU+GPU configurations, Cyrix was out there, blazing a trail.

It’s unfortunate that the trail led straight into what the locals affectionately call “Alligator Swamp.”

Designed for the extreme budget market, the Cyrix MediaGX appears to have disappointed just about everyone who ever came in contact with it. Performance was poor — a Cyrix MediaGX 333 had 95 percent of the integer performance and 76 percent of the FPU performance of a Pentium 233 MMX, a CPU running at just 70 percent of its clock. The integrated graphics had no video memory at all. There was no option to add an off-die L2 cache. If you found this under your tree, you cried. If you had to use this for work, you cried. If you needed to use a Cyrix MediaGX laptop to upload a program to sabotage the alien ship that was going to destroy all of humanity, you died.
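
Run the numbers from that comparison and the per-clock picture gets even uglier: normalized to the Pentium’s clock, the MediaGX delivered roughly two-thirds of Intel’s integer throughput per MHz and just over half the FPU throughput. Quick arithmetic:

```c
#include <stdio.h>

int main(void) {
    /* Figures from the comparison above. */
    double gx_mhz = 333.0, p55c_mhz = 233.0;
    double int_ratio = 0.95;  /* MediaGX 333 vs. Pentium 233 MMX, integer */
    double fpu_ratio = 0.76;  /* same comparison, FPU */

    /* Per-clock throughput relative to the Pentium: */
    printf("integer per MHz: %.2f\n", int_ratio * p55c_mhz / gx_mhz); /* ~0.66 */
    printf("FPU per MHz:     %.2f\n", fpu_ratio * p55c_mhz / gx_mhz); /* ~0.53 */
    return 0;
}
```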

All in all, not a great chip.
