By Justin Gural
on 3.8.12
Photo by Intel

The history of modern computing began in the 1950s with the development of reliable, discrete transistors, which were smaller, consumed less power, ran much cooler, and lasted longer than the vacuum tubes used in the first generation of computing. The true “explosion” in computing, however, came in the 1960s during the so-called third generation of computing, thanks to a leap in innovation developed independently by two brilliant minds. This advancement was key to the evolution of personal computing and to shaping the world we live in today.


In 1958, Texas Instruments engineer Jack St. Clair Kilby demonstrated the first working integrated circuit, built from germanium, work that eventually earned Kilby the Nobel Prize in Physics in 2000. Roughly six months later, Robert Noyce of Fairchild Semiconductor independently developed an integrated circuit of his own, made of silicon, which addressed several practical issues that Kilby’s version did not. Ten years later, Noyce partnered with Gordon E. Moore (of “Moore’s Law” fame) to drive innovation in semiconductors by founding Integrated Electronics, or “Intel” for short. While the company initially focused on memory chips, it soon turned its attention to microprocessors, releasing the first commercially available single-chip microprocessor, the Intel 4004, in 1971.

The microprocessor was a tremendous development in the field of computing, integrating a computer’s central processing unit, or CPU, onto a single integrated circuit. It was designed as a programmable device that could accept digital data as input, process it according to instructions stored in its memory, and provide the results as output. The new design greatly reduced the cost of processing power and was far more reliable, since it contained fewer electrical connections that could fail. More importantly, even as microprocessor designs improved and got faster, their production cost generally remained constant.
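
For the technically curious, here is a minimal sketch in C of the stored-program idea the 4004 embodied: instructions live in memory alongside data, and the processor repeatedly fetches, decodes, and executes them. The opcodes and the toy program below are invented for illustration and bear no relation to Intel’s actual instruction set.

```c
/* A toy stored-program processor: a hypothetical instruction set,
 * not Intel's. The CPU fetches an opcode from memory, decodes it,
 * and executes it, advancing the program counter as it goes. */
#include <stdio.h>
#include <stdint.h>

enum { OP_LOAD = 0, OP_ADD = 1, OP_PRINT = 2, OP_HALT = 3 };

int main(void) {
    /* "Memory" holding a tiny program: load 2, add 3, print, halt. */
    uint8_t memory[] = { OP_LOAD, 2, OP_ADD, 3, OP_PRINT, OP_HALT };
    uint8_t acc = 0;   /* accumulator register */
    size_t  pc  = 0;   /* program counter */

    for (;;) {
        uint8_t op = memory[pc++];                       /* fetch */
        switch (op) {                                    /* decode */
            case OP_LOAD:  acc = memory[pc++];  break;   /* execute */
            case OP_ADD:   acc += memory[pc++]; break;
            case OP_PRINT: printf("%u\n", (unsigned)acc); break;
            case OP_HALT:  return 0;
        }
    }
}
```

Running the program prints 5. Change the bytes in memory and the same hardware does something entirely different, which is precisely what made the microprocessor so flexible and so cheap to apply everywhere.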

Thanks to these benefits, microprocessors made possible smaller, low-cost microcomputers that individuals and small businesses could own. By the late 1980s, Intel had become the premier supplier of microprocessors for the personal computer boom, with IBM as its biggest client. Leveraging that success, Intel rightfully made its mark on consumers’ minds in the early 1990s with its line of Pentium processors.

Today, microprocessors serve as the digital brains of millions of devices around the globe, extending well beyond PCs and servers. To learn more, check out the history of Intel on their website.

Editor’s Note: You may think Intel is included in this article simply because they helped sponsor our Breakthroughs series, but truthfully, few companies in existence have had such an impact on the development of personal electronics as Intel. Enough said.

Written by Justin Gural. Additional contributions by Ben Bowers and Eric Yang.

Inside an Intel 45nm Chip Factory
