When Kilby was told that he had won the Nobel Prize in 2000, ten years after Noyce had died, among the first things he did was praise Noyce. “I’m sorry he’s not still alive,” he told reporters. “If he were, I suspect we’d share this prize.” When a Swedish physicist introduced him at the ceremony by saying that his invention had launched the global Digital Revolution, Kilby displayed his aw-shucks humility. “When I hear that kind of thing,” he responded, “it reminds me of what the beaver told the rabbit as they stood at the base of Hoover Dam: ‘No, I didn’t build it myself, but it’s based on an idea of mine.’ ”18

MICROCHIPS BLAST OFF

The first major market for microchips was the military. In 1962 the Strategic Air Command designed a new land-based missile, the Minuteman II; each missile would require two thousand microchips just for its onboard guidance system. Texas Instruments won the right to be the primary supplier. By 1965 seven Minutemen were being built each week, and the Navy was also buying microchips for its submarine-launched missile, the Polaris. With a coordinated astuteness not often found among military procurement bureaucracies, the designs of the microchips were standardized. Westinghouse and RCA began supplying them as well. So the price soon plummeted, until microchips were cost-effective for consumer products and not just missiles.

Fairchild also sold chips to weapons makers, but it was more cautious than its competitors about working with the military. In the traditional military relationship, a contractor worked hand in glove with uniformed officers, who not only managed procurement but also dictated and fiddled with design. Noyce believed such partnerships stifled innovation: “The direction of the research was being determined by people less competent in seeing where it ought to go.”19 He insisted that Fairchild fund the development of its chips using its own money so that it kept control of the process. If the product was good, he believed, military contractors would buy it. And they did.

America’s civilian space program was the next big booster for microchip production. In May 1961 President John F. Kennedy declared, “I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.” The Apollo program, as it became known, needed a guidance computer that could fit into a nose cone. So it was designed from scratch to use the most powerful microchips that could be made. The seventy-five Apollo Guidance Computers that were built ended up containing five thousand microchips apiece, all identical, and Fairchild landed the contract to supply them. The program beat Kennedy’s deadline by just a few months; in July 1969 Neil Armstrong set foot on the moon. By that time the Apollo program had bought more than a million microchips.

These massive and predictable sources of demand from the government caused the price of each microchip to fall rapidly. The first prototype chip for the Apollo Guidance Computer cost $1,000. By the time they were being put into regular production, each cost $20. The average price for each microchip in the Minuteman missile was $50 in 1962; by 1968 it was $2. Thus was launched the market for putting microchips in devices for ordinary consumers.20
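
The implied pace of that decline is worth making explicit. The following is a back-of-the-envelope calculation from the figures above, not one the text itself performs, written in LaTeX notation:

% Minuteman chip prices fell from $50 in 1962 to $2 in 1968,
% a factor of 25 over six years:
\[
\left(\frac{\$50}{\$2}\right)^{1/6} = 25^{1/6} \approx 1.71
\]
% That is, the average price fell by a factor of about 1.7,
% or roughly 40 percent, every year for six straight years.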

The first consumer devices to use microchips were hearing aids because they needed to be very small and would sell even if they were rather expensive. But the demand for them was limited. So Pat Haggerty, the president of Texas Instruments, repeated a gambit that had served him in the past. One aspect of innovation is inventing new devices; another is inventing popular ways to use these devices. Haggerty and his company were good at both. Eleven years after he had created a huge market for inexpensive transistors by pushing pocket radios, he looked for a way to do the same for microchips. The idea he hit upon was pocket calculators.

On a plane ride with Jack Kilby, Haggerty sketched out his idea and handed Kilby his marching orders: Build a handheld calculator that can do the same tasks as the thousand-dollar clunkers that sit on office desks. Make it efficient enough to run on batteries, small enough to put into a shirt pocket, and cheap enough to buy on impulse. In 1967 Kilby and his team produced almost what Haggerty envisioned. It could do only four tasks (add, subtract, multiply, and divide) and was a bit heavy (more than two pounds) and not very cheap ($150).21 But it was a huge success. A new market had been created for a device people had not known they needed. And following the inevitable trajectory, it kept getting smaller, more powerful, and cheaper. By 1972 the price of a pocket calculator had dropped to $100, and 5 million units were sold. By 1975 the price was down to $25, and sales were doubling every year. In 2014 a Texas Instruments pocket calculator cost $3.62 at Walmart.

MOORE’S LAW

That became the pattern for electronic devices. Every year things got smaller, cheaper, faster, more powerful. This was especially true—and important—because two industries were growing up simultaneously, and they were intertwined: the computer and the microchip. “The synergy between a new component and a new application generated an explosive growth for both,” Noyce later wrote.22 The same synergy had happened a half century earlier when the oil industry grew in tandem with the auto industry. There was a key lesson for innovation: Understand which industries are symbiotic so that you can capitalize on how they will spur each other on.

If someone could provide a pithy and accurate rule for predicting the trend lines, it would help entrepreneurs and venture capitalists to apply this lesson. Fortunately, Gordon Moore stepped forward at that moment to do so. Just as microchip sales were starting to skyrocket, he was asked to forecast the future market. His paper, titled “Cramming More Components onto Integrated Circuits,” was published in the April 1965 issue of Electronics magazine.

Moore began with a glimpse of the digital future. “Integrated circuits will lead to such wonders as home computers—or at least terminals connected to a central computer—automatic controls for automobiles, and personal portable communications equipment,” he wrote. Then he produced an even more prescient prediction that was destined to make him famous. “The complexity for minimum component costs has increased at a rate of roughly a factor of two per year,” he noted. “There is no reason to believe it will not remain nearly constant for at least ten years.”23

Roughly translated, he was saying that the number of transistors that could be crammed, cost-effectively, onto a microchip had been doubling every year, and he expected it to do so for at least the next ten years. One of his friends, a professor at Caltech, publicly dubbed this “Moore’s Law.” In 1975, when the ten years had passed, Moore was proved right. He then modified his law by cutting the predicted rate of increase in half, prophesying that the future numbers of transistors crammed onto a chip would show “a doubling every two years, rather than every year.” A colleague, David House, offered a further modification, now sometimes used, which said chip “performance” would double every eighteen months because of the increased power as well as the increased numbers of transistors that would be put onto a microchip. Moore’s formulation and its variations proved useful for at least the subsequent half century, and they helped chart the course for one of the greatest bursts of innovation and wealth creation in human history.
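
The law and its revisions can be captured in one compound-growth formula. This is a standard formalization rather than Moore’s own notation; the symbols N_0 (the transistor count in a base year) and T (the doubling period) are assumed here for illustration:

\[
N(t) = N_0 \cdot 2^{\,t/T}
\]
% Original 1965 prediction: T = 1 year,  so ten years yield a factor of 2^{10} = 1{,}024.
% 1975 revision:            T = 2 years, so ten years yield a factor of 2^{5} = 32.
% House's variant: "performance" doubles with T = 18 months, i.e. a factor of 2^{2t/3} after t years.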

Moore’s Law became more than just a prediction. It was also a goal for the industry, which made it partly self-fulfilling. The first such example occurred in 1964, as Moore was formulating his law. Noyce decided that Fairchild would sell its simplest microchips for less than they cost to make. Moore called the strategy “Bob’s unheralded contribution to the semiconductor industry.” Noyce knew that the low price would cause device makers to incorporate microchips into their new products. He also knew that the low price would stimulate demand, high-volume production, and economies of scale, which would turn Moore’s Law into a reality.24