From Tyler Cowen comes Bruce Sterling’s interesting answer to the 2013 Edge symposium question: What should (or shouldn’t) we be worried about? Since his response is a short one, and since there are a lot of responses in the symposium that are worth reading, we’ll quote Sterling in full below, but we encourage you to read the whole thing:
Since it’s 2013, twenty years have passed since Vernor Vinge wrote his remarkably interesting essay about “the Singularity.”
This aging sci-fi notion has lost its conceptual teeth. Plus, its chief evangelist, visionary Ray Kurzweil, just got a straight engineering job with Google. Despite its weird fondness for AR goggles and self-driving cars, Google is not going to finance any eschatological cataclysm in which superhuman intelligence abruptly ends the human era. Google is a firmly commercial enterprise.
It’s just not happening. All the symptoms are absent. Computer hardware is not accelerating on any exponential runway beyond all hope of control. We’re no closer to “self-aware” machines than we were in the remote 1960s. Modern wireless devices in a modern Cloud are an entirely different cyber-paradigm than imaginary 1990s “minds on nonbiological substrates” that might allegedly have the “computational power of a human brain.” A Singularity has no business model, no major power group in our society is interested in provoking one, nobody who matters sees any reason to create one, there’s no there there.
So, as a Pope once remarked, “Be not afraid.” We’re getting what Vinge predicted would happen without a Singularity, which is “a glut of technical riches never properly absorbed.” There’s all kinds of mayhem in that junkyard, but the AI Rapture isn’t lurking in there. It’s no more to be fretted about than a landing of Martian tripods.
We at Via Meadia are big fans of both Vernor Vinge and Bruce Sterling. They write great page-turning science fiction, but more than that they are deep thinkers who have a way of fruitfully peering into our future, even if their prophecies don’t always pan out perfectly. Sterling is certainly right to point out that the idea of the AI singularity isn’t aging well. The emergence of super-intelligent, self-aware machines, capable of hitching a ride on Moore’s law and improving themselves until we poor humans are left in the dust, looks more and more like a mirage on the horizon—forever near, yet forever out of reach. (Of course, radically discontinuous change always looks out of reach — until it happens!)
Here on the blog, however, we think of the Singularity in broader terms than Sterling. Our concept is closer to John von Neumann’s idea of the “ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.” This kind of singularity refers to any fundamental, discontinuous break in the human story that changes the human condition: a civilization-ending nuclear war, medical discoveries that lead to essentially infinite lifespans either in medically treated bodies or somehow “uploaded” into machines, runaway global warming leading to Condition Venus, self-replicating nanotechnology escaping the lab and turning the world into gray goo (kind of like Kurt Vonnegut’s ice-nine). There are many other scenarios that would qualify as singularities. Francis Fukuyama has pointed to what you could call a “soft singularity” in which new varieties of psychoactive drugs like Adderall and Prozac increasingly turn consciousness from a given produced by interaction with the outside world into something that we determine for ourselves by varying our drug dosage. Just as Einsteinian physics breaks down inside a black hole, these technological singularities would signal some kind of fundamental breakdown of social order.
Our century will continue to be haunted by the potential for singularities—if not the AI apocalypse that some fear and others would welcome. Indeed, as Sterling concludes in his Edge essay, “there’s all kinds of mayhem” out there right now, waiting to be discovered. Whether we will actually experience a singularity, much less what that singularity would be like and what life would look like on the other side, is something we cannot know.
The computers may not be taking over just yet, but the 21st century remains an apocalyptic time in which politics and culture are haunted by the prospect of radical, discontinuous change that could break in on our lives at almost any time.