Birth of a Standard: The Intel 8086 Microprocessor
Article by Benj Edwards
PC World, June 16, 2008

Thirty years ago, Intel released the 8086 processor, introducing the x86 architecture that underlies every PC--Windows, Mac, or Linux--produced today.

The release of Intel's 8086 microprocessor in 1978 was a watershed moment for personal computing. The DNA of that chip is likely at the center of whatever computer--Windows, Mac, or Linux--you're using to read this, and it helped transform Intel from merely one of many chip companies to the world's largest.

Intel's 8086 microprocessor

What's most surprising about the tremendous success of the 8086, though, is how little people expected of it when it was first conceived. The history of this revolutionary processor is a classic tale of how much a small team of bright engineers can accomplish when they're given the freedom to do their jobs in innovative ways.

When development of the 8086 began in May 1976, Intel executives never imagined its spectacular impact. They saw it as a minor stopgap project. They were pinning the company's hopes on a radically different and more sophisticated processor called the 8800 (later released as the iAPX 432). In an era when most chips still used 8-bit data paths, the 8800 would leapfrog all the way up to 32 bits. Its advanced multitasking capabilities and memory-management circuitry would be built right into the CPU, allowing operating systems to run with much less program code.

But the 8800 project was in trouble. It had encountered numerous delays as Intel engineers found that the complex design was difficult to implement with then-current chip technology. And Intel's problems didn't stop there--it was being outflanked by Zilog, a company started by former Intel engineers. Zilog had quickly captured the midrange microprocessor market with its Z80 CPU. Released in July 1976, it was an enhanced clone of Intel's successful 8080--the processor that had effectively launched the personal-computer revolution. Intel had yet to come up with an answer to the Z80.

Enter the Architect

Stephen Morse

Former Intel engineer Stephen Morse was the architect of the 8086's underlying code.

Intel execs maintained their faith in the 8800, but knew they needed to respond to Zilog's threat somehow. They turned to Stephen Morse, a 36-year-old electrical engineer who had impressed them with a critical examination of the 8800 processor's design flaws. The company's upper brass picked Morse as the sole designer for the 8086. "If [Intel] management had any inkling that this architecture would live on through many generations and into today's ... processors," recalls Morse, "they never would have trusted this task to a single person." (For more, see our in-depth interview with Morse.)

Picking Morse was surprising for another reason: He was a software engineer. Previously, CPU design at Intel had been the domain of hardware engineers alone. "For the first time, we were going to look at processor features from a software perspective," says Morse. "The question was not 'What features do we have space for?' but 'What features do we want in order to make the software more efficient?'" That software-centric approach proved revolutionary for the industry.

Although the 8086 was Morse's pet project, he didn't work alone. Joining Morse's team were other Intel employees, including Bill Pohlman, Jim McKevitt, and Bruce Ravenel, all of whom were essential in bringing the 8086 to market in the summer of 1978.

Beyond laying down some basic requirements--that the 8086 be compatible with software written for the popular 8080 chip and that it be able to address 128KB of memory--Intel leadership stayed out of Morse's way. "Because nobody expected the design to live long, no barriers were placed in my way, and I was free to do what I wanted," he says.

Lackluster Release

Upon its release, Morse's creation hardly took the computing world by storm. The midrange personal-computer market was saturated with cookie-cutter business machines based on the Z80 and running CP/M, the OS du jour of the late 1970s. The 8086 first appeared in a few unremarkable PCs and terminals. It gained a bit of a foothold in the portable computer market (in the form of the 80C86). Eventually it found acceptance in the microcontroller and embedded-applications market, most notably in the NASA Space Shuttle program, which uses 8086 chips to control diagnostic tests on its solid-rocket boosters to this day. (The space agency buys electronic relics on eBay to scavenge for the processors.)

In March 1979, Morse left Intel. Then a series of seemingly unremarkable events conspired to make the 8086 an industry standard.

Intel's 8088 microprocessor

A few weeks after Morse's departure, Intel released the 8088, which Morse calls "a castrated version of the 8086" because it kept the 8086's 16-bit internals but exposed only an 8-bit external data bus. Since many systems and support chips were still 8-bit, the 8088 sent out its 16-bit data in two 8-bit cycles, making it compatible with cheaper 8-bit hardware.
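
As a rough sketch of what that splitting amounts to--purely illustrative, not anything taken from Intel or from the article--the following C fragment models one cycle on an 8-bit external bus as a function that accepts a single byte, and shows a 16-bit word crossing it in two cycles, low byte first:

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative stand-in for one cycle on an 8-bit external data bus:
       it accepts exactly one byte per call. */
    static void bus_cycle(uint8_t byte)
    {
        printf("bus cycle: 0x%02X\n", byte);
    }

    /* An 8086 moves a 16-bit word over its 16-bit bus in one cycle; an
       8088 splits the same word into two 8-bit cycles, low byte first
       (the x86 family stores the low-order byte at the lower address). */
    static void write_word_8088_style(uint16_t word)
    {
        bus_cycle((uint8_t)(word & 0xFF));  /* first cycle: low byte   */
        bus_cycle((uint8_t)(word >> 8));    /* second cycle: high byte */
    }

    int main(void)
    {
        write_word_8088_style(0x1234);  /* prints 0x34, then 0x12 */
        return 0;
    }

The trade-off is the one described above: compatibility with inexpensive 8-bit support hardware, at the cost of twice as many bus cycles per 16-bit word.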

Two years later, IBM began work on the model 5150, the company's first PC to consist only of low-cost, off-the-shelf parts. It was a novel concept for IBM, which previously emphasized its proprietary technology to the exclusion of all others.

Obviously, an off-the-shelf system demanded an off-the-shelf microprocessor. But which to choose? IBM decided early on that its new machine required a 16-bit processor, and narrowed the choices down to three candidates: the Motorola 68000 (the powerful 16-bit processor at the heart of the first Macintosh), the Intel 8086, and its "castrated" cousin, the Intel 8088.

According to David J. Bradley, an original member of the IBM development team, the company eliminated the Motorola chip from consideration because IBM was more familiar and comfortable with Intel processors. Tipping the scales was the fact that Microsoft had a ready and working BASIC interpreter available for the 8086 and, since it shared the same base code, the 8088.

IBM then had to choose between the 8086 and the 8088. Ultimately, the decision came down to the simple economics of reducing chip count. IBM selected the 8088, a decision that allowed the company to build cheaper machines because it could use fewer ROM modules and less RAM, Bradley says.

In a sense, though, it didn't matter which of the Intel chips IBM chose. Both were built on the same underlying 8086 code written by Stephen Morse.

From Chip to Standard

How did the 8086 code become an industry standard? The answer is wrapped up in the important role of IBM's 5150 itself. (The 5150 is number 6 on our list of the 25 Greatest PCs of All Time.) The PC industry in the early 1980s was a little like Eastern Europe after the fall of the Soviet Union--lots of fractured republics all headed in different directions. Dozens of different computer platforms were available from just as many manufacturers. Incompatibilities among computer systems constantly frustrated users, who longed to use software, hardware, and peripherals from one machine on another.

Gradually, however, the disparate parts of the PC universe fell into orbit around the 5150. One big reason for its success was the IBM name on the box, which carried more cachet among business buyers than the names of rivals such as Radio Shack or Apple. The question of the day was, "Do you want to buy a computer from International Business Machines or from a company named after a fruit?" Bradley says.

And because IBM had used off-the-shelf components, other companies could produce clones--and clone they did.

With the IBM PC quickly becoming dominant, Intel capitalized on the trend by developing improved versions of the 8086 over the years, starting with the 80186 and then progressing to the 80286, 80386, 80486, Pentium, and so on, up to the present. Thanks to the common end-numerals in most of those CPU designations, the line became known as "x86," even after Intel switched to trademark-eligible names such as Pentium, Celeron, and Centrino. Other CPU manufacturers soon joined the Intel bandwagon, with companies such as AMD, Cyrix, NEC, and even IBM releasing their own x86-compatible processors, further cementing x86 as a PC standard.

Right Place, Right Time

According to Morse and Bradley, our current x86-dependence mostly came down to chance. "I was just lucky enough to have been at the right place at the right time," says Morse. "Any bright engineer could have designed the processor. It would probably have had a radically different instruction set, but all PCs today would be based on that architecture instead." In a similar vein, IBM veteran Bradley jokes, "If IBM had chosen the Motorola 68000 for the IBM PC (as some wanted), we would have had the WinOla duopoly rather than the Wintel duopoly."

The true power of x86 lies not in the particular operation codes that make our CPUs run, but in the momentum of common computer standards. The 8086 paved the way for rapid, exponential progress in computer speed, capacity, and price-performance--all driven by fierce competition among hundreds of companies vying to improve the same thing.

Morse's humble 8086 instruction set still lies at the heart of nearly every modern PC CPU, from the Opteron to the Athlon to the Core 2 Quad. For a practical demonstration of just how powerful the x86 standard is, consider this: Any assembly-language program written as far back as 1978 for the Intel 8086 microprocessor will run, unmodified, on Intel's latest Core 2 Extreme CPU--just 180,000 times faster.