I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.
Richard Brautigan, 1967
What does cyber even mean? And where does the idea come from?
Many people have been asking this question for some time now. My students at King's College London did, as did cyber warfare officers in the US Air Force and Pentagon strategists. Secretive British spies inquired, as did bankers and hackers and scholars. They all struggled with the rise of computer networks, and what that rise means for our security and our liberty. And they all slapped the prefix cyber in front of something else, as in cyberspace or cyberwar, to make it sound more techy, more edgy, more timely, more compelling, and sometimes more ironic.
I didn't have a good answer for them, despite having written widely on cybersecurity. This was frustrating. Like so many others, I could only point, vaguely, to a curious origin story: that of a science fiction novel written in the mid-1980s, William Gibson's Neuromancer. The indefatigable Gibson plucked cyberspace out of his airy imagination, the story goes, and then punched the future into his olive-green portable typewriter, a Hermes made in 1927. For Gibson, that new electronic space inside the machines, meaningless yet evocative, was the perfect replacement for outer space as a playground for the protagonists of his science fiction novels. He simply wanted an unblemished stage. That origin story, as improbable as it is, has been repeated countless times. Remarkably, one of the biggest challenges to our civil liberties and state sovereignty in the twenty-first century was commonly traced to a fictional story about a drug addict escaping into hallucinatory computer networks.
Could that be right? It was as if history started in 1982 with a fantasy. How did Gibson's story fit into a larger picture, into a cultural and technical trajectory that takes a longer view? How exactly had cyberspace made the leap from Gibson's solitary typewriter to the Pentagon's futuristic and lavishly staffed Cyber Command? By 2010, a fast-growing share of the work of the National Security Agency, and of its British counterpart, Government Communications Headquarters (GCHQ), was somehow cyber related, as jargon had it.
Then came the great intelligence leaks of 2013. The NSA's and GCHQ's publicly revealed technological capabilities outraged privacy activists and some allies, but humbled many of the world's most fearsome spy agencies; the leaks demonstrated a capability gap to us, one Chinese intelligence officer told me in Beijing later that year. China had to catch up. Meanwhile, major computer network breaches escalated, with foreign spies and criminals siphoning off vast amounts of intellectual property and sensitive personal information. By 2015, the global cybersecurity market of firms offering a smattering of security solutions had surpassed $75 billion, swelling at double-digit rates. So precarious were the new threats that even in times of economic hardship and austerity, government and military budget lines weren't just safe from cuts, but sloping up fast.
Yet the real story of one of the world's most exciting, most expensive, and most menacing ideas remained an enigma. So where does cyber come from? What is the history of this idea? And what does it actually mean?
I started digging. Rise of the Machines is what I found.
Cyber is a chameleon. For politicians in Washington, the word stands for power outages that could plunge entire cities into chaos at any moment. For spies in Maryland, it stands for conflict and war, and for data being stolen by Russian criminals and Chinese spies. For executives in the City of London, it stands for major security breaches, for banks bleeding money, and for ruined corporate reputations. For inventors in Tel Aviv, it triggers visions of humans merging with machines, of wired-up prostheses with sensitive fingertips, and of silicon chips implanted under tender human skin. For science fiction fans in Tokyo, it stands for an escapist yet retro punk aesthetic, for mirrored shades, leather jackets, and worn-down, dusty gadgets. For romantic internet activists in Boston, it stands for a new realm of freedom, a space beyond the control of oppressive governments and law enforcement agencies. For engineers in Munich, it stands for steely control, and for running chemical plants by computer console. Ageing hippies in San Francisco nostalgically think back to wholeness and psychedelics and turning on the brain. And for screen-addicted youth in between, cyber means simply sex-by-video-chat. The word refuses to be either noun or prefix. Its meaning is equally evasive, hazy, and uncertain. Whatever it is, it is always stirring, it is always about the future, and it always has been.
One way to clear the fog is to study the history of one of the weightiest and most pivotal ideas of the twentieth century, an idea whose legacy is set to be even more momentous as the twenty-first century moves on: cybernetics. Cybernetics was a general theory of machines, a curious postwar scientific discipline that sought to master the swift rise of computerized progress. From the get-go in the early 1940s, it was about computers, control, security, and the ever-evolving interaction between humans and machines.
A crucial moment, it turns out, was World War II, in particular the air defense problem that emerged as this epic confrontation got under way. To shoot down deadly new bombers, ground-based artillery needed complex ballistic calculations, completed faster and more accurately than human computers could perform them, or even read them off precalculated range tables. Machines needed to be invented for the task. And soon mechanical brains started to think, in the quaint language of the time. The rise of the machines had begun.
In 1940, in the midst of all this, a curious story ran its course at the vast campus of the Massachusetts Institute of Technology. Norbert Wiener, an eccentric mathematician, read about howitzers and artillery shells and was inspired. Shooting at the blue sky, aided by creaking computers, appealed to the roly-poly professor. After the war he took a tangled set of ideas from electrical engineers and weapons designers, straightened it out, refined it, repackaged it, and, with a generous gesture, threw his creation out to an eager public like candy to a throng of hungry children.
The timing was perfect. At the end of the decade, the technological wonders of the military effort were beginning to seep into industry and private households. Somebody needed to explain the new gadgetry and its purpose. Enter cybernetics, the bold theory of future machines and their potential. Wiener and his keen acolytes would enchant the machine; seduced by their own theory, they endowed it with spirit and an appeal that would extend to the cultish. Engineers, military thinkers, politicians, scholars, artists, and activists started projecting their hopes and their fears into the future of thinking machines.
The postwar rise of the machines spans a wide arch: the most crucial anchor point of that arch emerged in the late 1940s with the publication of Wiener's epoch-making book Cybernetics. The tweedy scholar with thick, horn-rimmed glasses revealed the magic of feedback loops, of self-stabilizing systems, of machines that could autonomously adapt their behavior and learn. Automata now had a purpose and could even self-reproduce, at least in theory. The machine suddenly seemed lifelike.