
Last week, we covered two key developments regarding AMD and Intel. First, AMD chose to move its GPUs back to 7nm at TSMC, raising the question of what this might say about GlobalFoundries' own 7nm ramp, particularly given that GF recently replaced its CEO for the fourth time in a decade and has had well-known and repeated problems ramping cutting-edge process nodes. Then Intel declared it would push back its own 10nm ramp from 2018 to 2019 — and not even the first half of 2019, just "2019." Neither one of these things is great for either company, though the fine folks at AMD would undoubtedly note that one of them is speculation and the other is now established fact. And of course, they'd be correct about that.

But there's a larger point here — one that's easy to forget, given how long it's been since AMD was slugging it out with Intel for the top of the CPU market. From mid-2006 to 2011 (the launch of Bulldozer), AMD fought a rear-guard action to take on Intel's Core 2 Duo and Core 2 Quad, and it did so reasonably well. You have to go back more than 12 years to find a period of time when AMD was solidly in the driver's seat or competing for the same, and it's therefore easy to forget the way things used to look. Cast your eyes back farther, and some of the issues we're seeing in the CPU industry will look a little more familiar.

The Way We Were

From 1996-2006, the CPU market could practically have doubled as a soap opera. CPUs like the Pentium MMX and Pentium II were supposed to trounce every non-Intel x86 manufacturer on the planet. This largely worked — over several years, Intel ate into what little market share companies like WinChip and Cyrix were able to build for themselves. AMD, however, was the bug that refused to be squished. In 1995, AMD bought the struggling CPU designer NexGen and brought that company's CPU to market as the K6, later extended as the K6-2 and K6-3.

Prior to the launch of the K7 Athlon, these cores were little more than an annoyance to Intel — AMD's K6 and K6-2 offered gaming and FPU performance that was just good enough to skate by as the poor man's gaming system, particularly since they leaned on the older Socket 7 (later extended into Super Socket 7) technology. Up until August 1999, the only real harm Intel had taken from its x86 upstarts was being forced into releasing a Celeron with an L2 cache (Mendocino) as opposed to the terrible, crippled variant with just L1 that they'd tried launching first (Deschutes). But the debut of the original K7 Athlon changed everything. Before K7, AMD was the "good enough" alternative you could reasonably game on.

A full rehash of the product launches and misadventures that hit both companies would fill several articles, but here's a short list.

AMD

AMD found itself with a fabulous CPU and terrible chipsets. The chipset problem went on for years, with VIA cheerfully willing to rely on the infamous VT82C686B, despite knowing that it contained a software RAID bug that would literally and permanently corrupt data on your hard drives if you used them in a RAID array while simultaneously using a SoundBlaster — and back then, nearly everyone used a SoundBlaster. Nvidia's entrance into the chipset market was generally hailed by everyone who ever got bitten by VIA's habit of selling a new chipset… only to sell you the real version of the chipset that you should have bought 6-12 months later, labeled with an "A" to distinguish it. Everyone was pretty fine with this at first (KT133, KT133A), but round about the time we got the KT266A (aka "the one with DDR performance that was actually better than using SDR RAM"), folks were really sick of that particular habit.

Incidentally, VIA's PC business mostly died when it decided that it could take Intel on over needing a bus license for the Pentium 4. VIA, which held roughly a third of Intel's Pentium 3 chipset business, declared it didn't need a license. Intel said it did. VIA said it didn't — and all of the various motherboard OEMs dropped VIA chipsets like a bad habit when Intel told them to. VIA's Intel business never recovered from the Pentium 4 debacle, and its AMD business died in the K8 era, when AMD users flocked to Nvidia chipsets instead.

AMD enjoyed a mostly strong run from 1999-2001, but the 130nm shrink of the Pentium 4 transformed that CPU from turd to titan. Intel raced up the clock charts from early 2002 to mid-2003, leaping from 2GHz to 3.06GHz, increasing its FSB clock, and hammering the Thunderbird and later Palomino CPU cores. AMD's first stab at 130nm fell completely flat, and there seemed to be little hope left for the company to mount a defense against the P4, until it turned things around just two months later with the launch of a second 130nm die shrink — this time with an extra metal layer and a few hundred MHz of additional clock. The Thoroughbred "B" core didn't put AMD back on top, but it helped the company fight a rear-guard action against the P4 (which was, incidentally, eating its lunch) long enough to launch the Athlon 64 in 2003.

AMD K8 architecture


Even then, Intel was far from finished. It's easy to remember Prescott as being a terrible chip from Day 1, but when the chip debuted on launch day there were plenty of people who thought Intel would turn the core around when it shifted from Socket 478 to LGA 775. It didn't — and that represented a tremendous opportunity for AMD, which took advantage of these shifts to win new markets in servers and to aggressively bring dual-core parts to market. AMD's Athlon 64 and 64 X2 had their golden age from 2004 to mid-2006, but it took AMD years to lay the groundwork for those successes, and to dodge challenges that ranged from the strength of its motherboard partners to its own fab and foundry difficulties. With the exception of the 180nm shift, which it made with IBM's assistance, AMD was often half a node to a node behind Intel (the gap between the two companies lengthened as time went on), and it often seemed to get fewer benefits from node shifts as well, for a variety of reasons.

Intel

Intel is genuinely difficult to summarize throughout this same period. From 1996-2006, it attempted to take over the RAM market (and failed), had to recall the Pentium 3 1.13GHz, had to recall the entire i820 with Memory Translator Hub motherboard family, weathered a storm of criticism concerning its new CPU architecture, fought off withering reviews of that same architecture, launched a new 130nm variant of said Netburst uarch (Northwood) that laughed at all the haters as it skyrocketed from 2GHz to 3.06GHz in less than 18 months, then watched Northwood's successor explode like the Russian taiga post-Tunguska after it shrank down to 90nm.

P4 Prescott LGA

Prescott ran so hot, it melted plastic standoffs beneath test motherboards and caused a Prescott-compatible small form factor system sent to my former employer to *ignite.* No exaggeration.

Seriously, though. By January 1, 2006, Intel looked pretty damn cooked. AMD's Athlon 64 X2 family had knocked the Pentium 4 off enthusiast radars. The advent of dual cores had made AMD the superior solution for workstations, and Opteron would hit ~20 percent of the server market that same year. But even allowing for the impact Intel's market manipulations had on AMD's overall success, there were some substantial land mines in AMD's path as well — the company paid twice what it should've for ATI, costing it billions of dollars. Its Phenom architecture would fail to match Intel's Core 2 Duo, and while AMD's Phenom II was a pretty good CPU, it wasn't good enough to catch Nehalem. Whatever else one might think of Intel, Santa Clara didn't put a TLB bug in Phenom, delay Bulldozer, repeatedly push back the launch of AMD's "Fusion" processors, or build a CPU that's often been called "AMD's Pentium 4."

And what even AMD wasn't paying enough attention to, back in January 2006, was that Intel had been quietly evolving its mobile Pentium M architecture for several years, quietly assembling the pieces of a puzzle that would lead to its domination of the CPU market for nearly a decade. I still have a Tualatin Pentium 3 CPU — a 130nm die shrink of the 180nm P3 that was beaten down by Athlon and Thunderbird before being replaced by the mostly inferior Pentium 4 Willamette. In 2001, the P3 Tualatin was the best overall CPU core Intel had — one it didn't want to sell, because Netburst was supposed to be the future. In 2008, Nehalem — Tualatin's descendant — launched a nine-year period of Intel dominance.

The Moral of the Story

In the late 1990s, prior to the launch of Athlon, few gave AMD much chance of surviving. On January 1, 2006, there were many in the enthusiast community who thought Intel's market share would continue to collapse. Today, it's common to see people suggest that ARM or Samsung or TSMC (or some combination of all of the above) will be the end of Intel. And some fall back to the old AMD / Intel argument. Old fandoms die hard, and with Intel and AMD's mutual focus on the PC space, it's easy to frame the current fight as just another iteration of the same battle the two have waged before.

The advantage of stepping back and taking the longer view is that it sets some of these episodes in context. The computing world is not the same as it was in 2006 or 1996, but Intel's 10nm delay will not, on balance, particularly cripple the company — though it may certainly create conditions that favor Intel's competitors more than Intel itself. And AMD, having weathered disasters much larger than a hypothetical 7nm delay, is unlikely to be crippled here, even if I'm right that there are reasons to be concerned.

After over a decade of stasis, the topsy-turvy, take-no-prisoners resumption of meaningful competition can look more climactic than it is. This type of struggle, where both companies push to bring new products to market, fight with uncooperative technologies, and fend off shots from one another, used to be a lot more common than it has been of late. It's not the sign of the end for either company — it's a bit of competitive limbering-up after a very long freeze.

Whether or not either company can continue delivering meaningful year-on-year improvements in the face of declining node shrink benefits and the difficulties of improving single-threaded performance is a very good question — but again, beyond the scope of this article. The point is, when two companies are actually pushing each other, the end result isn't a perfect set of improvements delivered on a precise cadence year after year. Both times AMD actually managed to slug it out with Intel across the entire CPU stack, the result was a glorious train wreck of launches, moves, counter-moves, face-plants, abject failures, and occasional soaring successes. It was, in a word, interesting. And it looks like things might be getting more interesting over the next few years.