The Innovators: How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution - Isaacson Walter. Page 107

Justin Hall (1974– ) and Howard Rheingold (1947– ) in 1995.

CHAPTER ELEVEN

THE WEB

There was a limit to how popular the Internet could be, at least among ordinary computer users, even after the advent of modems and the rise of online services made it possible for almost anyone to get connected. It was a murky jungle with no maps, filled with clusters of weird foliage with names like alt.config and Wide Area Information Servers that could intimidate all but the most intrepid pathfinder.

But just when the online services began opening up to the Internet in the early 1990s, a new method of posting and finding content miraculously appeared, as if it had burst into life from an underground atom smasher, which in fact was close to what happened. It made the carefully packaged online services obsolete, and it fulfilled—indeed far surpassed—the utopian dreams of Bush, Licklider, and Engelbart. More than most innovations of the digital age it was invented primarily by one man, who gave it a name that managed to be, as he was personally, both expansive and simple: the World Wide Web.

TIM BERNERS-LEE

As a kid growing up on the edge of London in the 1960s, Tim Berners-Lee came to a fundamental insight about computers: they were very good at crunching step by step through programs, but they were not very good at making random associations and clever links, the way that an imaginative human could.

This is not something that most kids ponder, but both of Berners-Lee’s parents were computer scientists. They worked as programmers on the Ferranti Mark I, the commercial version of the Manchester University stored-program computer. One evening at home his father, who had been asked by his boss to draft a speech on how to make computers more intuitive, talked about some books on the human brain that he was reading. His son recalled, “The idea stayed with me that computers could become much more powerful if they could be programmed to link otherwise unconnected information.”1 They also talked about Alan Turing’s concept of a universal machine. “It made me realize that the limitations on what you could do with a computer were just the limitations of your imagination.”2

Berners-Lee was born in 1955, the same year as Bill Gates and Steve Jobs, and he considered it a lucky time to be interested in electronics. Kids of that era found it easy to get hold of basic equipment and components that they could play with. “Things came along at the right time,” he explained. “Anytime we understood one technology, then industry produced something more powerful that we could afford with our pocket money.”3

In primary school, Berners-Lee and a friend hung around hobby shops, where they used their allowance to buy electromagnets and make their own relays and switches. “You’d have an electromagnet banged into a bit of wood,” he recalled. “When you turned it on, it would attract a bit of tin and that would complete a circuit.” From that they developed a deep understanding of what a bit was, how it could be stored, and the things that could be done with a circuit. Just when they were outgrowing simple electromagnetic switches, transistors became common enough that he and his friends could buy a bag of a hundred pretty cheaply. “We learned how to test transistors and use them to replace the relays we had built.”4 In doing so, he could visualize clearly what each component was doing by comparing them to the old electromagnetic switches they superseded. He used them to make audio sounds for his train set and to create circuits that controlled when the train should slow down.

“We began to imagine quite complicated logical circuits, but those became impractical because you would have to use too many transistors,” he said. But just as he ran into that problem, microchips became available at the local electronics store. “You buy these little bags of microchips with your pocket money and you’d realize you could make the core of a computer.”5 Not only that, but you could understand the core of the computer because you had progressed from simple switches to transistors to microchips and knew how each worked.

One summer just before he went off to Oxford, Berners-Lee had a job in a lumber yard. When he was dumping a pile of sawdust into a Dumpster, he spied an old calculator, partly mechanical and partly electronic, with rows of buttons. He salvaged it, wired it up with some of his switches and transistors, and soon had it working as a rudimentary computer. At a repair shop he bought a broken television set and used the monitor to serve as a display, after figuring out how the circuit of vacuum tubes worked.6

During his Oxford years, microprocessors became available. So, just as Wozniak and Jobs had done, he and his friends designed boards that they tried to sell. They were not as successful as the Steves, partly because, as Berners-Lee later said, “we didn’t have the same ripe community and cultural mix around us like there was at the Homebrew and in Silicon Valley.”7 Innovation emerges in places with the right primordial soup, which was true of the Bay Area but not of Oxfordshire in the 1970s.

His step-by-step hands-on education, starting with electromagnetic switches and progressing to microprocessors, gave him a deep understanding of electronics. “Once you’ve made something with wire and nails, when someone says a chip or circuit has a relay you feel confident using it because you know you could make one,” he said. “Now kids get a MacBook and regard it as an appliance. They treat it like a refrigerator and expect it to be filled with good things, but they don’t know how it works. They don’t fully understand what I knew, and my parents knew, which was what you could do with a computer was limited only by your imagination.”8

There was a second childhood memory that lingered: that of a Victorian-era almanac and advice book in his family home with the magical and musty title Enquire Within Upon Everything. The introduction proclaimed, “Whether You Wish to Model a Flower in Wax; to Study the Rules of Etiquette; to Serve a Relish for Breakfast or Supper; to Plan a Dinner for a Large Party or a Small One; to Cure a Headache; to Make a Will; to Get Married; to Bury a Relative; Whatever You May Wish to Do, Make, or to Enjoy, Provided Your Desire has Relation to the Necessities of Domestic Life, I Hope You will not Fail to ‘Enquire Within.’”9 It was, in some ways, the Whole Earth Catalog of the nineteenth century, and it was filled with random information and connections, all well indexed. “Enquirers are referred to the index at the end,” the title page instructed. By 1894 it had gone through eighty-nine editions and sold 1,188,000 copies. “The book served as a portal to a world of information, everything from how to remove clothing stains to tips on investing money,” Berners-Lee observed. “Not a perfect analogy for the Web, but a primitive starting point.”10

Another concept that Berners-Lee had been chewing on since childhood was how the human brain makes random associations—the smell of coffee conjures up the dress a friend wore when you last had coffee with her—whereas a machine can make only the associations that it has been programmed to make. He was also interested in how people work together. “You got half the solution in your brain, and I got half in my brain,” he explained. “If we are sitting around a table, I’ll start a sentence and you might help finish it, and that’s the way we all brainstorm. Scribble stuff on whiteboard, and we edit each other’s stuff. How can we do that when we are separated?”11