May 16th, 2006
ZDNet reports on » the great Singularity debate. Here’s a primer if the Singularity (big ‘S’) is a new idea to you:
Sometime in the next few years or decades, humanity will become capable of surpassing the upper limit on intelligence that has held since the rise of the human species. We will become capable of technologically creating smarter-than-human intelligence, perhaps through enhancement of the human brain, direct links between computers and the brain, or Artificial Intelligence. This event is called the “Singularity” by analogy with the singularity at the center of a black hole – just as our current model of physics breaks down when it attempts to describe the center of a black hole, our model of the future breaks down once the future contains smarter-than-human minds. Since technology is the product of cognition, the Singularity is an effect that snowballs once it occurs – the first smart minds can create smarter minds, and smarter minds can produce still smarter minds.
Ridiculous sci-fi? Frightening vision of the future? Something to look forward to?
If we have superintelligent robots, the good news is that they will view us as pets; the bad news is that they will view us as food.
Not food, nor batteries (cf. The Matrix). If it ever did come to that, they would more likely view us as a distracting irrelevance.