After we covered some of the worst storage media ever made, it’s time to revisit some of the worst CPUs ever made. To make it onto this esteemed list, a CPU needs to be fundamentally broken, not simply poorly positioned or slower than expected. History is already littered with mediocre products that didn’t quite live up to expectations but weren’t truly terrible either.
NOTE: A lot of people will bring up the Pentium FDIV bug here, but the reason we didn’t include it is simple: despite being a marketing disaster for Intel and an enormous expense, the actual flaw was minor. It affected almost no one who wasn’t doing scientific computing, and the technical size and scope of the problem were never that significant. The incident is remembered today more for the disastrous way Intel handled it than for any underlying issue in the Pentium microarchitecture.
We also include some dishonorable mentions. These chips may not be the worst ever, but they suffered from serious problems or failed to address key market segments.

Intel Itanium
Intel’s Itanium was a radical attempt to shift the burden of hardware complexity onto software optimization. Before the CPU executed a single byte of code, the compiler was supposed to do all the work of determining which instructions could run in parallel. Analysts predicted Itanium would conquer the world. It didn’t: compilers couldn’t extract the necessary performance, and the chip was incompatible with everything that came before it. Once expected to completely replace x86, Itanium instead limped along in a niche market for years and accomplished little else.
Itanium’s failure was particularly shocking because it took down Intel’s entire 64-bit strategy of the time. Intel originally planned to move the whole market to IA-64 rather than extend x86. AMD’s x86-64 (AMD64) proved extremely popular in part because Intel failed to bring a competitive Itanium to market. Few CPUs can claim a failure so severe that it killed the manufacturer’s plans for an entire instruction set.

Intel Pentium 4 (Prescott)
Prescott doubled down on the Pentium 4’s already long pipeline, extending it to 31 stages while Intel simultaneously shrank the chip to 90 nanometers. This was a mistake. The new chip was crippled by pipeline stalls that even its improved branch prediction unit couldn’t prevent, and leakage current drove power consumption so high that the chip couldn’t reach the clock speeds it needed to succeed. Prescott and its dual-core sibling, Smithfield, were the weakest desktop products Intel had ever fielded relative to its competition at the time. Intel set revenue records with the chip, but its reputation took a hit.

AMD Bulldozer
AMD’s Bulldozer was supposed to leapfrog Intel by cleverly sharing certain chip resources to improve efficiency and reduce die size. AMD wanted a smaller core with higher clock speeds to offset any penalties associated with the shared design. What it got was a disaster. Bulldozer failed to hit its target clocks, consumed too much power, and delivered only a fraction of the performance it needed. Rarely has a CPU been so bad that it nearly killed the company that built it, but Bulldozer came close. Worse, AMD was stuck with it: despite its core flaws, Bulldozer and its derivatives anchored AMD’s CPU lineup from late 2011 until early 2017.

Cyrix 6×86
Cyrix was one of the x86 manufacturers that didn’t survive the late 1990s (VIA holds its x86 license today), and chips like the 6×86 are a major reason why. The 6×86’s compatibility was questionable; it’s the reason some games and applications of the era shipped with compatibility warnings. The chip was much faster than Intel’s Pentium in integer code, but its FPU was terrible, and it wasn’t particularly stable in Socket 7 motherboards. If you were a gamer in the late 1990s, you wanted an Intel CPU, with AMD as an acceptable alternative. The 6×86 was one of those chips that “other people” had, the kind you did not want to find in your Christmas stocking.
The 6×86 failed because it couldn’t differentiate itself in any meaningful way from Intel or AMD, and it couldn’t carve out a viable niche of its own for Cyrix. The company tried to build a unique product and wound up in second place on this list instead.

Cyrix MediaGX
The MediaGX was the first attempt to build a desktop SoC, with the graphics, CPU, PCI bus, and memory controller all integrated on a single chip. Unfortunately, this happened in 1997, which meant all of those components were, well, bad. Motherboard compatibility was extremely limited, the underlying CPU architecture (the Cyrix 5×86) was equivalent to Intel’s 80486, and the chip couldn’t connect to an off-chip L2 cache (the only kind of L2 cache that existed at the time). Chips like the Cyrix 6×86 could at least claim to compete with Intel in business applications. The MediaGX couldn’t compete with a dead manatee.
The Wikipedia entry for the MediaGX includes the sentence: “Whether this processor belongs to the fourth or fifth generation of x86 processors can be considered controversial.” For reference, fifth-generation x86 means the Pentium family, while fourth-generation refers to the 80486. The MediaGX launched in 1997 with a CPU core stuck somewhere between 1989 and 1992, in an era when people genuinely had to replace their PCs every two to three years to stay current. The entry also notes that “due to the tight integration, the graphics, sound, and PCI bus run at the same speed as the processor clock. This makes the processor seem much slower than its actual rated speed.” You know you have a problem when your 486-class CPU is being choked by its own PCI bus.

Texas Instruments TMS9900
The TMS9900 was a notable failure for one big reason: when IBM was looking for a chip to power the original IBM PC, it had two realistic options that could meet its shipping dates, the TMS9900 and the Intel 8086/8088 (the Motorola 68K was in development but not ready in time). The TMS9900 had only a 16-bit address space, while the 8086 had a 20-bit one: the difference between addressing 64KB and 1MB of RAM. TI also neglected to develop 16-bit peripheral chips, which left the CPU stuck with low-performance 8-bit peripherals. The TMS9900 had no on-chip general-purpose registers either; its sixteen 16-bit registers were all stored in main memory. TI struggled to find a second-source partner, and when IBM had to choose, it chose Intel. The rest is history.
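The address-space gap is easy to quantify. Here’s a quick sketch of the arithmetic: the TMS9900 used flat 16-bit addresses, while the 8086 combined a 16-bit segment and a 16-bit offset into a 20-bit physical address (the function names below are ours, for illustration only):

```python
def tms9900_max_bytes() -> int:
    # Flat 16-bit addresses: 2**16 bytes total.
    return 2 ** 16

def i8086_physical(segment: int, offset: int) -> int:
    # 8086 segment:offset addressing -> 20-bit physical address:
    # (segment << 4) + offset, wrapping at 1MB.
    return ((segment << 4) + offset) & 0xFFFFF

print(tms9900_max_bytes())             # 65536 bytes -> 64KB
print(i8086_physical(0xFFFF, 0x000F))  # 1048575 -> top of the 1MB space
```

Sixteen times the addressable memory, from the same 16-bit register width, is exactly the kind of headroom IBM needed.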

Dishonorable Mention: Qualcomm Snapdragon 810
The Snapdragon 810 was Qualcomm’s first attempt at building a big.LITTLE CPU, and it was built on TSMC’s short-lived 20nm process. The SoC was easily Qualcomm’s least popular high-end chip in recent memory: Samsung skipped it entirely, while other companies ran into serious problems with it. Qualcomm claimed the chip’s problems were caused by poor OEM power management, but whether the fault lay with TSMC’s 20nm process, Qualcomm’s implementation, or OEM optimization, the result was the same: a hot-running chip that won precious few flagship designs and that no one missed.

Dishonorable Mention: IBM PowerPC G5
Apple’s collaboration with IBM on the PowerPC 970 (sold by Apple as the G5) was hailed as a turning point for the company. When Apple released its first G5 products, it promised a 3GHz chip within a year. But IBM failed to deliver parts that could hit those clocks at reasonable power consumption, and the G5’s power draw meant it could never replace the G4 in laptops. Apple was ultimately forced to turn to Intel and x86 to ship competitive laptops and improve its desktop performance. The G5 wasn’t a bad CPU, but IBM never developed it into a chip that could compete with Intel.

Dishonorable Mention: Pentium III 1.13GHz
The Coppermine Pentium III was an excellent architecture. But in the race to 1GHz against AMD, Intel was desperate to hold on to its performance lead even as its high-end shipments kept slipping (at one point AMD led Intel by a 12:1 margin in actually shipping 1GHz systems). In an attempt to reclaim the performance crown, Intel pushed the 180nm Coppermine P3 to 1.13GHz. It failed. The chips were so unstable that Intel recalled the entire batch.

Dishonorable Mention: Cell Broadband Engine
We’re going to catch some flak for this one, but we’re tossing the Cell Broadband Engine on the pile anyway. Cell is a great example of a chip that was brilliant in theory but nearly impossible to exploit in practice. Sony used it as the general-purpose processor for the PS3, but Cell was far better at multimedia and vector processing than at general-purpose workloads (its design harkened back to the days when Sony intended to use the same processor architecture for both CPU and GPU workloads). Multi-threading code to take advantage of its SPEs (Synergistic Processing Elements) was quite difficult, and the architecture bore little resemblance to anything else on the market.
What is the worst CPU ever?
Surprisingly, it’s hard to pick a single worst CPU. Which matters more: a CPU that completely failed to live up to sky-high expectations (Itanium), or a CPU core that nearly killed the company that built it (Bulldozer)? Do we judge Prescott by its heat and performance (terrible on both counts), or by the revenue records Intel set with it?
Evaluating “worst” in the broadest sense, one chip ultimately sinks below the rest in my opinion: the Cyrix MediaGX. It’s impossible not to admire the forward thinking behind this CPU. Cyrix was the first company to build what we now call an SoC, integrating the PCI bus, audio, video, and memory controller all on the same chip. More than a decade before Intel or AMD introduced their own CPU+GPU configurations, Cyrix was already blazing the trail.
Unfortunately, that trail led directly into what the locals affectionately call “Crocodile Swamp.”
The Cyrix MediaGX was designed for the ultra-budget market, but it seemed to disappoint everyone who encountered it. Performance was poor: the MediaGX 333 delivered 95 percent of the integer performance and 76 percent of the FPU performance of a Pentium 233 MMX, even though the Pentium ran at just 70 percent of the MediaGX’s clock speed. The integrated graphics had no dedicated video memory. There was no option to add off-chip L2 cache. If you found one under the tree, you’d cry. If you had to work on one, you’d cry. If you needed a Cyrix MediaGX laptop to upload a program to sabotage an alien mothership before it destroyed all of humanity, you’d be doomed.
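To see how bad that is per clock, here’s a back-of-the-envelope sketch using the figures above (333MHz MediaGX vs. 233MHz Pentium MMX; the helper name is ours, for illustration):

```python
mediagx_mhz, pentium_mhz = 333.0, 233.0

def per_clock_ratio(relative_perf: float) -> float:
    """MediaGX throughput per MHz, relative to the Pentium 233 MMX."""
    # Normalize the overall performance ratio by the clock-speed ratio.
    return relative_perf * (pentium_mhz / mediagx_mhz)

integer_ipc = per_clock_ratio(0.95)  # 95% of Pentium's integer performance overall
fpu_ipc = per_clock_ratio(0.76)      # 76% of Pentium's FPU performance overall

print(f"Integer, per clock: {integer_ipc:.2f}x")  # ~0.66x
print(f"FPU, per clock:     {fpu_ipc:.2f}x")      # ~0.53x
```

In other words, clock for clock, the MediaGX did about a third less integer work and roughly half the floating-point work of a Pentium MMX, and it needed a 43 percent clock advantage just to roughly break even on integer code.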
All in all, this is not a great chip.