A TIME magazine article a few months back explores the idea of technological Singularity. The gist of it is that technology is advancing at a faster and faster rate and will eventually reach a point of near-infinite growth, the point of Singularity. At this point, machines will become conscious, and “the human age will be over.” Now of course the article cites several science fiction authors, but it mainly quotes and profiles legitimate scientists. There are a lot of people who take this seriously. Many believe it is inevitable. Some have put a date on it (2045 CE, to be exact). There is a significant group of scientists and inventors working toward it. They have their own Singularity convention, described in the article as something between Comic-Con and an academic symposium.
The article quotes the Singularity Movement’s detractors as accusing it of being a Silicon Valley version of the Evangelical rapture: a bunch of sad, disillusioned geeks looking to technology for salvation. This places the Singularity movement firmly within a growing trend of scientists and technologists whose faith in Science (with a capital S) to solve all our problems is absolute. One such person mentioned in the article is Cambridge-trained biologist Aubrey de Grey, who believes that death is simply an illness in need of a cure, and who seems to believe that merging human and (inevitable future) machine consciousness may be the key: the scientific version of everlasting life.
In my opinion, the biggest problem with the Singularity movement is that its members stopped reading science fiction back in the early fifties, when it was still optimistic, and have neglected the science fiction of the past five decades. Maybe they’ve never seen a minor, underground, cult classic, indie film from the 80s (I’m sure you’ve never heard of it); it’s called The Terminator. Science fiction has been grappling with artificial intelligence and Singularity for a long damn time, which leads me to my favorite quote from the article, one that could have come directly out of a science fiction story (Lev Grossman, the author of the article, is also a science fiction/fantasy author). It expresses my skepticism with clarity and wit:
“You don’t have to be a super-intelligent cyborg to understand that introducing a superior life-form into your own biosphere is a basic Darwinian error.”
Part of a (Long) Series of (Short) Posts about Science and Technology
The Tragic Irony of Technology Coltan, cellphones and being connected
Singularity, Progress, and Darwinian Common Sense Artificial Intelligence and Sciencism
Middleduction A post that would have made a nice introduction (coming soon)
Science Fiction as Prophetic Witness or Scientific Gospel? (coming soon)
Creating the Problem in order to Fix It (coming soon)
More on Sciencism (coming soon)
Kierkegaardian Dread (coming soon)