Suppose an AI suspects that humans might interfere with its plans, writes philosopher Nick Bostrom of the University of Oxford in the United Kingdom. It could decide to build tiny weapons and covertly distribute them around the world. "At a pre-set time, nanofactories producing nerve gas or target-seeking mosquito-like robots might then burgeon forth simultaneously from every square meter of the globe."
For Bostrom and a number of other scientists and philosophers, writes Kai Kupferschmidt in today's Science, such scenarios are more than science fiction. They're studying which technological advances pose "existential risks" that could wipe out humanity or at least end civilization as we know it—and what could be done to stop them. "Think of what we're trying to do as providing a scientific red team for the things that could threaten our species," says philosopher Huw Price, who heads the Center for the Study of Existential Risk (CSER) here at the University of Cambridge.
The idea of science eliminating the human race can be traced at least as far back as Frankenstein. In Mary Shelley's novel, the monster grows angry at his creator, Victor Frankenstein, for having spurned him. He kills Frankenstein's little brother William, but then offers the doctor a deal: make a female companion for me, and we will leave you in peace and go to South America to live out our days. Frankenstein starts working on the bride, but realizes that the couple might reproduce and outcompete humans: "A race of devils would be propagated upon the earth who might make the very existence of the species of man a condition precarious and full of terror." He destroys the half-finished female, reigniting the creature's wrath and bringing about his own demise.
"I think Frankenstein illustrates the point beautifully," says physicist Max Tegmark of the Massachusetts Institute of Technology (MIT) in Cambridge, a board member of CSER and a co-founder of a similar think tank, the Future of Life Institute (FLI), near MIT. "We humans gradually develop ever-more-powerful technology, and the more powerful the tech becomes, the more careful we have to be, so we don't screw up with it."
Image credit: Borg image from Star Trek