PREFACE
There are, generally speaking, two popular accounts of the invention of personal computing.
The first roots the PC in the exploits of a pair of young computer hobbyists-turned-entrepreneurs, Stephen Wozniak and Steven Jobs. Wozniak, the story goes, built a computer to share with his friends at the Homebrew Computer Club, a ragtag group that began meeting on the San Francisco Midpeninsula in the spring of 1975. Jobs, his friend from high school, had the foresight to see that there might be a consumer market for such a machine, and the two went on to found Apple Computer in 1976.
The second account locates the birthplace of personal computing at Xerox's fabled Palo Alto Research Center in the early 1970s. There, the giant copier company assembled a group of the nation's best computer scientists and gave them enough freedom to conceive of information tools for the office of the future. Out of that remarkable collection of talent came a computer called the Alto, the forerunner of today's desktops and portables. Although Xerox is reputed to have "fumbled the future" by failing to commercialize the device, PARC spawned dozens of spin-offs and gave rise to one of Silicon Valley's most oft-told fables: that in 1979 Jobs visited PARC and took away with him the idea of the graphical user interface.
Both stories are true, yet they are both incomplete.
This book is about what came before, about the extraordinary convergence of politics, culture, and technology that took place in a period of less than two decades and within the space of just a few square miles. Out of that convergence came a remarkable idea: personal computing, the notion that one person should control all of the functions of a computer and that the machine would in turn respond as an idea amplifier. By the late 1960s, that idea was already in the air on the San Francisco Midpeninsula.
Before the arrival of the Xerox scientists and the Homebrew hobbyists, the technologies underlying personal computing were being pursued at two government-funded research laboratories located on opposite sides of Stanford University. The two labs had been founded during the sixties, based on fundamentally different philosophies: Douglas Engelbart's Augmented Human Intellect Research Center at Stanford Research Institute was dedicated to the concept that powerful computing machines would be able to substantially increase the power of the human mind. In contrast, John McCarthy's Stanford Artificial Intelligence Laboratory began with the goal of creating a simulated human intelligence.
One group worked to augment the human mind; the other to replace it.
Although the two groups had little direct contact during the sixties, within each lab was a handful of researchers and engineers who understood early on a fundamental truth about the microelectronics industry then taking root in Santa Clara Valley: unlike any previous technology, the silicon chip would by its very nature lead inexorably to ever greater computing power. Moreover, as the transistors etched onto silicon wafers shrank in size, the pace of that increase would accelerate.
Each time the dimensions of a transistor were cut in half, four times as many circuits could be packed onto the same area of silicon. Computer speed and capacity would continue to increase while costs fell and the size of computers shrank. It was a straightforward insight, but for those who made the leap it was the mind-expanding equivalent of taking a psychedelic drug.
In 1965, Gordon Moore, then at Fairchild Semiconductor and later a cofounder of Intel, described the phenomenon, which came to be known as Moore's Law and became Silicon Valley's defining principle. By the 1980s and 1990s, Moore's Law had emerged as the underlying assumption governing almost everything in the Valley, from technology and business to education and even culture. The "law" held that the number of transistors on a chip would double roughly every two years. It dictated that nothing stays the same for more than a moment, that no technology is safe from its successor, and that costs fall and computing power rises not at a constant rate but exponentially: if you're not running on what became known as "Internet time," you're falling behind.
Although Moore received the intellectual credit for the paradigm, his law had actually been anticipated some years earlier by a handful of computing pioneers who were among the first to contemplate the new semiconductor-manufacturing technology based on photolithographic printing of transistors and logic circuits on the surface of silicon wafers. At the beginning of the 1960s, a small group of computer designers and engineers working with integrated circuits had realized that the technology held stunning economic implications, and not just for moon shots and nuclear-tipped missiles. As semiconductor-manufacturing capabilities were refined, it became apparent that computing, then in the hands of just a few, would eventually be available to everyone.
To these pioneers, the trajectory was obvious. As a result, while the early machines used by researchers at the Stanford laboratories were neither desktop-size nor personal, the central ideas of interactivity and individual control quickly became ingrained in everything they designed.
The idea of personal computing was born in the sixties; only later, when falling costs and advancements in technology made it feasible, would the box itself arrive.
The engineers' insight did not occur in a vacuum, however. The shrinking silicon chip grew out of the twin geopolitical challenges of placing a man on the moon and squeezing navigational circuitry into the nosecone of an ICBM. That is hard to appreciate today, in part because the relentless pace of the semiconductor industry has made progress seem almost mechanistic, with each new generation of chips arriving like clockwork. In a similar fashion, the two Stanford laboratories came into existence in a remarkable place at an extraordinary time. During the sixties and early seventies, the San Francisco Midpeninsula witnessed an epochal intersection of science, politics, art, and commerce, a convergence comparable to those of such landmark times and places as Vienna after World War I.
Beginning in the fifties, the computer had come under attack as a symbol of large, centralized, bureaucratic institutions. Lewis Mumford, writing in The Myth of the Machine: The Pentagon of Power, asserted that the electronic computer had been created in opposition to human freedom and denounced the computer technicians who worked at creating superhuman machines. In the course of a single decade, however, that worldview changed. Computing went from being dismissed as a tool of bureaucratic control to being embraced as a symbol of individual expression and liberation. The evolution of the perception of the computer mirrored other changes in the world at large.
By the end of the 1960s, the United States had been transformed by a broad political and social upheaval that stripped away the comfortable middle-class veneer of the previous decade. The civil rights, psychedelic, women's rights, ecology, and antiwar movements all contributed to the emergence of a counterculture that rejected many of America's cherished postwar ideals. The computer technologies that we take for granted today owe their shape to this unruly period, which was defined by protest, experimentation with drugs, countercultural community, and a general sense of anarchic idealism.
Stewart Brand has argued in his essay "We Owe It All to the Hippies" that "the counterculture's scorn for centralized authority provided the philosophical foundations of not only the leaderless Internet but also the entire personal-computer revolution."1 Theodore Roszak has advanced a similar argument in From Satori to Silicon Valley (1986), a monograph that traces the rise of the personal-computer industry to countercultural values of the period.