
The Birth of the Microprocessor

The Intel 4004 is considered the first microprocessor, but its creation by Intel came down to a combination of hard work, the right timing, and just plain luck.

Intel 4004

This is part of a series of posts about the circumstances leading up to the launch of the Altair 8800 in the January, 1975 issue of Popular Electronics. In my last post, I talked about the dawn of the personal computer.

The Intel 4004 is considered the first microprocessor—in other words, the first general-purpose computer on a chip—but its creation by Intel came down to a combination of hard work, the right timing, and just plain luck.

The story of the chip really begins in 1969, when a Japanese company called the Nippon Calculating Machine Corporation (better known as Busicom, after the name of its calculators) contracted with Intel to build the chips needed for a new calculator. Busicom was a relatively small calculator company losing share in a rapidly consolidating market, and it needed a new solution. Intel, meanwhile, was a startup, founded in 1968 with around 200 employees and focused mostly on building memory chips.

Both needed something new.

Intel co-founder and then CEO Robert Noyce had visited Japan around late 1968, looking for customers. Noyce had a meeting with Sharp, then one of the leaders in calculators, but Sharp already had existing contracts. So Sharp's Tadashi Sasaki says he introduced Noyce to Busicom president Yoshio Kojima, and that's how Intel got the contract to build the chips for Busicom's calculator.

Marcian Edward "Ted" Hoff, who had joined Intel as employee No. 12 in 1968, was assigned to create products that would get people to switch from older core memory to Intel's new memory chips. In his telling, Intel's first custom project was to be done for a company it knew as Electro Technical Industries but that most called Busicom.

According to Masatoshi Shima, then a young engineer at Busicom but destined to become an important part of the design team, the company had plans to build a general purpose series of chips "that would be used for not only a desktop calculator, but also a business machine, such as a billing machine, cash register, and a teller machine." But Busicom didn't tell Intel this at the time, "because it was the confidential matter between Busicom and NCR Japan," so Intel thought the goal was just to build a more powerful calculator.

The initial contract was signed in April 1969, and at the end of June, Shima and two other Busicom engineers arrived at Intel. The original plan had Busicom engineers designing a series of LSI chips and Intel making the chips using its MOS (metal-oxide-semiconductor) technology. Intel was to receive $100,000 to create the chip sets and then $50 for each set made, with Busicom committing to at least 60,000 units.

Shima says his team proposed making nine kinds of LSI chips, but by most accounts this soon became a 12-chip proposal, with some of the chips requiring 3,000 to 5,000 transistors each—an enormous amount for 1969 when the standard calculator had six chips, each of which had 600 to 1,000 transistors. Hoff looked at the plans and thought the chips were too complicated to make and "that we wouldn't be able to manufacture these things to the price targets."

Hoff looked at the design and had a variety of ideas, including moving from decimal arithmetic to binary arithmetic and using a more general-purpose chip with a simple instruction set.

Hoff thought that the Busicom plan was overly complicated and instead suggested creating a general-purpose logic chip, with much of the instructions in software stored on memory chips. As quoted in Leslie Berlin's The Man Behind the Microchip (2006, Oxford University Press), Hoff went to Intel CEO Noyce and explained his concept, which would consist of one microprocessor, two memory chips, and a shift register. "I think we can do something to simplify this," Hoff said. "I know this can be done. It can be made to emulate a computer." Although he hadn't been officially tasked with the job of designing the chips for the machine, Noyce gave him permission to keep working on the concept.

Hoff worked on the concepts over the summer, and with engineer Stanley Mazor, Hoff created a block drawing of the architecture. This would be a 4-bit binary logic chip (as opposed to Busicom's decimal design) and would store the programs for running the calculator functions on a memory chip, which was Intel's specialty at the time.

There are somewhat different recollections of how Shima and the Busicom team reacted to the concept. According to Hoff, quoted in Michael S. Malone's The Intel Trinity (2014, HarperBusiness), "So I made some proposals to the Japanese engineers to do something along these lines [the general-purpose architecture]—and they were not the least interested. They said that they recognized the design was too complicated but they were working on the simplification and they were out to design calculators and nothing else. They simply were not interested."

Busicom's Masatoshi Shima, who was running the project from Busicom's side, remembers it a bit differently. In an oral history, he said, "I felt that Hoff's proposal was good, but if we accepted Hoff's proposal as it was, we had to do the project over again from the beginning." Shima noted all the details that Hoff didn't yet have.

In August, Noyce sent a note to Busicom president Yoshio Kojima warning him that because of the complexity of Busicom's design, there was "no possibility that we could manufacture these units for $50/kit even for the simplest kit" and suggesting the actual cost would be around $300.

This was followed by a formal letter to Busicom and a meeting between the two companies in October, where Busicom decided to go with Intel's design. But it would take until February 1970 for the formal contract to be agreed upon.

Faggin's Role
Busicom expected that Intel was working on the new plan and suggested that the company have a substantially completed circuit diagram ready by the time Shima, who had returned to Japan, came for a visit on April 7, 1970. But Intel, busy with problems on other chips and weathering an industry downturn, hadn't made any progress. In other words, it had the concept for the chips, including block diagrams of how they would need to work, but not the actual designs: the technical details of how the transistors would fit together and could be manufactured.

To lead that process, Intel hired Federico Faggin from Fairchild Semiconductor. As he describes it, he joined the company that week, and one of his first tasks was to meet Shima and explain that Intel didn't have the chips ready. "I had now this task where I am essentially six months late the day after I start," he said.

As Faggin described it in his story on the birth of the microprocessor, "I worked furiously, 12 to 16 hours a day. First, I resolved the remaining architectural issues and then I laid down the foundation of the design style that I would use for the chip set. Finally, I started the logic and circuit design and then the layout of the four chips. I had to develop a new methodology for random-logic design with silicon-gate technology; it had never been done before."

He worked closely with Shima, who was new to MOS design but had worked on LSI chips, and together they created the chips that would become the MCS-4 family. The model 4001 was a 2,048-bit ROM memory chip designed to hold the programming. The 4002 was a 320-bit RAM memory chip designed to hold working data. The 4003 was a 10-bit shift register used to feed data into the main processor and carry out the result. And finally, the model 4004 was a 4-bit central processing unit.
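That "4-bit" label is the defining constraint of the 4004: each register held a single 4-bit value, so any sum above 15 wrapped around, with the overflow surfaced as a carry bit. As a quick illustrative sketch (hypothetical Python, not a model of the 4004's actual instruction set):

```python
MASK_4BIT = 0xF  # a 4-bit register holds values 0..15

def add4(a, b):
    """Add two 4-bit values, returning (sum, carry) the way 4-bit hardware would."""
    total = (a & MASK_4BIT) + (b & MASK_4BIT)
    return total & MASK_4BIT, total >> 4  # low nibble, carry-out bit

print(add4(9, 8))  # (1, 1): 17 overflows, leaving 1 with a carry of 1
```

Larger numbers had to be handled in software, one 4-bit digit at a time, chaining the carry from each addition into the next, which is part of why the 4004 was a tight fit for anything beyond calculator work.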

By all accounts, this was a herculean effort, with Faggin and Shima developing chips far faster than was normal. The various chips were all in different parts of the process at different times, and at the end of December the first versions were ready. As usual, these required some tweaking, but by March, Faggin shipped the first fully operational 4004 to Shima, who by then had returned to Japan. In the end, the 4004 was a single silicon chip that measured one-eighth by one-sixth of an inch with 2,250 individual circuit elements.

In Faggin's account, "It took a little less than one year to go from the idea to a fully working product." According to Shima, "From the general idea of Busicom, it [the development] lasted about two years and three months. And in April, 1971, finally the desktop calculator worked publicly. I was so excited!"

Intel Gets Rights
In the initial contract for the chip, Busicom held the exclusive rights to the 4004. But by the spring of 1971, the calculator market was declining, and Busicom wanted to renegotiate the contract. While there were some concerns within Intel about the size of the market, and about the fact that Intel was then a memory company, not a processor company, Faggin, Hoff, and Mazor pressed others within the company to win back the rights to sell the chip to other customers.

As Hoff recalls, "One of the arguments I got from the marketing people, about the time I was saying, 'You should get the right to sell it,' was, 'Look, they only sell about 20,000 mini-computers each year. And we're late to the market, and you'd be lucky to get 10 percent of it. That's 2,000 chips a year.' And they said, 'It's just not worth the headaches of support and everything for a market of only 2,000 chips.'"

Eventually Noyce got the deal signed, and Intel was legally able to sell the chip to other companies, Busicom's competitors excepted.

But the 4004 was never to find a big audience with other customers, in part because of its limitations—it was only a 4-bit processor with limited memory. Intel formally announced the chip in an advertisement in the November 15, 1971 issue of Electronic News under the headline "A New Era in Integrated Electronics," with copy proclaiming it "a microprogrammable computer on a chip." But the industry, and Intel itself, was about to move on to newer and better processors.

The 8008 – Moving to 8-bit Computing
Not long after Busicom approached Intel to do chips for its calculator, Computer Terminal Corporation (CTC), later to be called Datapoint, asked Intel for a proposal for chips for a new computer terminal—a screen and keyboard designed to connect to a remote computer. Again, Hoff and Mazor proposed a microprocessor to handle the logic.

There were several big differences between the 4004 and the 8008, even though they appeared not long apart. To begin with, the 8008 was an 8-bit microprocessor, which made it big enough to work on 8 bits of data—enough for one "byte," or one character—at a time. Also, unlike the 4004, which required its own special memory chips, the 8008 was designed to use standard memory.
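To see why the wider data path mattered, here is a small illustrative sketch (hypothetical Python, not tied to either chip's instruction set). A 4-bit register tops out at 15, while an 8-bit register holds 0 through 255—enough for any single character code:

```python
# Range of an n-bit register: 0 .. 2**n - 1
for width in (4, 8):
    print(f"{width}-bit register holds values 0..{2**width - 1}")

code = ord("A")           # the character code for 'A' is 65
fits_in_4 = code <= 0xF   # False: 65 doesn't fit in a 4-bit register
fits_in_8 = code <= 0xFF  # True: one byte is enough for one character
print(code, fits_in_4, fits_in_8)
```

That is the practical meaning of "one byte, one character": the 8008 could move and compare text one character per operation, where a 4-bit chip needed multiple steps.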

The project started in December of 1969 with a meeting with Andrew Grove in which Datapoint asked for chips for an 8-bit computer. According to Mazor, he made three proposals to Datapoint—two variations on an 8-bit "register stack" and "an entire 8-bit CPU on one chip." By this point, Mazor and Hoff had already been working on the Busicom project that would include the 4004. 

At around the same time, Datapoint apparently asked Texas Instruments for a similar design. In some tellings, Datapoint carried Hoff and Mazor's schematic down to Dallas where the idea began to grow into a development program in TI's semiconductor lab.

Mazor says he thinks it is very likely that TI originally proposed a multi-chip set and that then Datapoint brought the Intel proposal to TI, so TI tried to build a chip to that specification. But Mazor says the TI chip couldn't have worked because his specification had a "defect."

Intel hired Hal Feeney in March 1970 to work on the specific design of the chip, then known as the 1201, much as Faggin had worked on the 4004; and indeed, each helped on the other project. Work on the 1201 continued until mid-1970, but then Intel was concerned about whether Datapoint would actually use the chip, so work was put on hiatus, while Mazor and others worked more on the 4004.

Texas Instruments had a chip design in March 1971, which would have been a couple of months before the 4004 was working, and actually announced its chip in July 1971, several months before the 4004 announcement. But this chip apparently never shipped.

But the TI announcement spurred Intel, and in particular Grove, to redouble its efforts on the 1201. In the end, Datapoint didn't use either the Intel or TI chips. Instead, by the time Intel had completed the design, the Datapoint 2200 was introduced using conventional TTL chips.

Even if Datapoint wasn't interested, Intel was starting to see interest from other companies, such as Seiko, which wanted to build an 8-bit scientific calculator.

Around this point, Intel began to think more seriously about naming. Intel's original naming scheme was based on the different kinds of parts it was creating, so each chip in the family would have had different numbers. Faggin says he came up with the naming for the 4000 family because it was more consistent. So after the 4004 introduction, the marketing department changed the 1201 to the 8008 to reflect its being an 8-bit chip, and that's what the 8008 was called when it was introduced in April 1972. The 8008 led to Intel's big effort on microprocessor marketing and led to the creation of the Microcomputer Systems Group and the creation of development boards and systems. This, in turn, certainly helped spur the creation of a number of 8-bit devices, including some of the machines that were early microcomputers.

Who Deserves the Credit?
Over the years, there have been a lot of debates about the 4004, its place as the first microprocessor, and the credit that each of the participants deserved.

The history of integrated circuits is one of further and further integration, so the idea that you could eventually put all the features you want into a "CPU on a chip" was certainly in the air by the end of the 1960s.

Intel wasn't alone in recognizing the need for a general purpose processor, as there were too many customers wanting processors for anyone to design a custom chip for each of them. Later Hoff and Noyce would write, "If this continued…the number of circuits needed would proliferate beyond the number of circuit designers. At the same time, the relative usage of each circuit would fall….Increased design cost and diminished usage would prevent manufacturers from amortizing costs over a large user population and would cut off the advantages of the learning curve."

"People had talked about a computer on a chip for years," Intel co-founder Gordon Moore said, "but it was always out there in the future. What Ted saw was that, with the complexity with which we were already working, you could actually make an integrated circuit like that now. That was the real conceptual breakthrough."

And even Ted Hoff has sometimes downplayed the importance of the concept. "The actual invention of the microprocessor wasn't as important as simply appreciating that there was a market for such a thing."

But there were other contenders for the title of first microprocessor. Texas Instruments actually announced a "CPU-on-a-chip" in April 1971, designed initially as a contract chip for Computer Terminal Corporation (later Datapoint). This apparently never worked, and in fact, Intel was working on a chip for CTC to the same specification; this was known as the 1201 and would eventually be renamed the 8008. Perhaps more importantly, by late 1971 Texas Instruments engineers Gary Boone and Michael Cochran produced the first prototype of an integrated circuit that included an input-output circuit, memory, and a central processor all on one chip, as opposed to the four-chip MCS-4 set. Known as the TMS1000, this was used initially in a TI calculator and became commercially available in 1974. Boone received a patent for his CPU in 1973, and later Boone and Cochran received a patent for a computer on a chip.

Intel's patent attorney was skeptical of making large claims and resisted Hoff's desire to patent the work as a "computer," both because it would be so complicated and because others had already conceived of putting a computer on a chip. According to Hoff, "he said they weren't worth it and essentially he refused at that time to write a patent." Instead, they filed more specific and more limited patents. Intel received two: Hoff, Mazor, and Faggin shared one on a "Memory System for a Multi-Chip Digital Computer," covering the external bus organization and the memory addressing scheme of the Intel MCS-4 chip set, while Faggin received one for a circuit that could reset the CPU when the power is turned on.

Years later, inventor Gilbert Hyatt would be granted a patent on the microprocessor, which he filed in 1970, based on an invention he says he made in 1968 at his company Microcomputer Inc. But this doesn't appear to have been manufactured. Meanwhile, Fairchild, IBM, Signetics, Four-Phase, and RCA were also working on microprocessor-like devices. Still, the 4004 is almost universally considered the first microprocessor.

Among the Intel team, there were also controversies about dividing the credit. Most observers credit all four men directly involved in creating the chip set, but it isn't always that way.

Faggin was to leave Intel in late 1974, just months after the 8080 introduction, to start Zilog, taking with him Shima and other Intel engineers, and in Faggin's telling, this angered Intel's Andy Grove. Malone quotes Faggin, saying "I remember him telling me, 'You will never succeed, no matter what you are going to do. You will have nothing to tell your children and grandchildren.' Implicit in those words was that I would have no legacy in semiconductors. That I would never be given credit for what I did at Intel. It was like he was casting a curse on me."

Whether or not it was that dramatic, it does seem like Intel gave most of the credit to Hoff, and that has continued in many histories. For instance, both T.R. Reid in The Chip (2001, Random House Trade Paperbacks) and Dirk Hanson in The New Alchemists (1983, The Book Service Ltd) give almost sole credit to Hoff, as does Grove biographer Richard Tedlow. Indeed, Malone says that from then on, Intel gave all the credit for the microprocessor to Hoff and none to Faggin until 2009, with the premiere of The Real Revolutionaries (2012, Diamond Docs, iLine Entertainment), a documentary about the founding of Silicon Valley.

But there are other histories that do point out Faggin's role (and those of Shima and Mazor, who are even more often overlooked), going back to interviews Hoff gave in the 1980s. In 1993, an Intel publication celebrating the company's 25th anniversary credited Hoff for the solution and gave him a nearly full-page picture, but Faggin was recognized for turning "Hoff's vision into silicon reality." In 1996, as we were celebrating the 25th anniversary of the microprocessor at an event at Comdex, Intel helped me get in touch with all four creators, who received the PC Magazine Lifetime Achievement award.

Indeed, it seems important to credit all four men—Hoff for his vision and the basic concepts, Mazor for the programming and work on the block diagrams, Shima for creating the logic design, and Faggin for creating the impressive silicon design for the chips. Together, they created the first general purpose microprocessor, and in doing so they created the foundation not only for what would become the personal computer industry, but also for an uncountable number of other electronic devices. Literally billions of microprocessors are sold each year—all far more complex than the original 4004—and without them, our modern electronic world would be impossible.

Next: The earliest personal computers

For more information, see Andy Grove: The Life and Times of an American by Richard S. Tedlow (2006, Portfolio Hardcover), "The Birth of the Microprocessor" by Federico Faggin, The Chip by T.R. Reid (2001, Random House Trade Paperbacks), "Defining Intel: 25 years, 25 Events," (1993, Intel Corporation), A History of Modern Computing by Paul E. Ceruzzi (2003, The MIT Press), Inside Intel by Tim Jackson (1997, Harper Collins), The Intel Trinity by Michael S. Malone (2014, HarperBusiness), The Man Behind the Microchip by Leslie Berlin (2006, Oxford University Press), Microchip by Jeffrey Zygmont (2002, Basic Books), The New Alchemists by Dirk Hanson (1983, The Book Service Ltd), "Oral History on the Development and Promotion of the Intel 4004 Microprocessor," Computer History Museum; "Oral History on the Development and Promotion of the Intel 8008 Microprocessor," Computer History Museum, The Real Revolutionaries (2012, Diamond Docs, iLine Entertainment).


About Michael J. Miller

Former Editor in Chief

Michael J. Miller is chief information officer at Ziff Brothers Investments, a private investment firm. From 1991 to 2005, Miller was editor-in-chief of PC Magazine, responsible for the editorial direction, quality, and presentation of the world's largest computer publication. No investment advice is offered in this column. All duties are disclaimed. Miller works separately for a private investment firm which may at any time invest in companies whose products are discussed, and no disclosure of securities transactions will be made.
