"Let an ultra-intelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultra-intelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultra-intelligent machine is the last invention that man need ever make."
Good also predicted in 1965 that within thirty years humans would possess the capability to build machines smarter than we are. While he has since revised that estimate (upward), Good feels certain it will happen no later than 2030. Given that we are just beginning to explore what is possible with nanotechnology, and that Moore's Law shows no signs of slowing down, he may well be right.
The phrase technological singularity is a term of art among futurists. It refers to the creation of an AI, or of enhanced human intelligence, that begins to drive technological advancement farther, faster, and beyond the ability of humans to participate. While futurists disagree substantially about when this will happen and what its impact will be, there is surprising agreement that superhuman intelligence will arrive.
As far back as 1958, some scientists were already predicting that technology would one day push mankind to a point where life as we know it would change radically. In a statement often repeated for its prescience, Stanislaw Ulam, a Polish mathematician who contributed greatly to the Manhattan Project, declared:
"One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."
—Ulam describing a conversation with John von Neumann (1958)
Leaving the geezers aside, Ray Kurzweil, admittedly among the more optimistic futurists, but one with a track record that should lend weight to anything the man says, feels certain the singularity is coming and that it will involve the integration of hardware with our wetware. Kurzweil is of the opinion that you will have the Internet in your head, total recall, and, most assuredly, that you will never be lost again. Recall Kramer blinking his eyes to activate the call waiting in his brain? You will no longer be unavailable. Worse, you may be running on Microsoft.
I hope Kurzweil is right. The only thing we have to fear is fear itself, to quote another eccentric. As a species we have moved past natural selection; we are now self-selecting. Had I been born just one hundred years ago, my mouth would have been a mess. Modern orthodontics and (relatively) painless dentistry fixed that. Never mind the Coke-bottle lenses that would have condemned me to a life of near blindness had I been born a hundred years earlier still.
I remember the first time my dad brought home a remote-control TV. Old-school dial phones. I played Tank on the Atari 2600 and tried to teach myself to program. I couldn't get past chapter 3 in the little spiral-bound notebook of instructions that came with my Commodore 64, but I figured out how to play Zork quickly enough. Now we have chess computers capable of beating grandmasters, although the best among us can still hold their own. I was regularly defeated by a machine from Sears.
Ray Kurzweil has this to say:
"Within 25 years, we'll reverse-engineer the brain and go on to develop super-intelligence. Extrapolating the exponential growth of computational capacity (a factor of at least 1000 per decade), we'll expand inward to the fine forces, such as strings and quarks, and outward. Assuming we could overcome the speed of light limitation, within 300 years we would saturate the whole universe with our intelligence."
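As a back-of-the-envelope check on that extrapolation: a factor of 1000 per decade is very nearly a doubling every year, since 2^10 = 1024. A minimal sketch of the arithmetic, assuming the rate holds steady (the function name and figures are illustrative, not Kurzweil's own):

```python
# Back-of-the-envelope extrapolation of "a factor of at least
# 1000 per decade" growth in computational capacity.
# Purely illustrative arithmetic, not a prediction.

def capacity_multiplier(years: float, factor_per_decade: float = 1000.0) -> float:
    """Total growth multiplier after `years` at the given per-decade rate."""
    return factor_per_decade ** (years / 10)

# One decade at this rate is, by definition, a factor of 1000;
# doubling every year for a decade gives 2**10 = 1024, slightly more.
print(capacity_multiplier(10))  # one decade: 1000.0
print(capacity_multiplier(25))  # the 25-year horizon: roughly 3.2e7
```

At that pace, 25 years compounds to 1000^2.5, about thirty million times today's capacity, which is the scale of growth the quote's brain-reverse-engineering claim leans on.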
The Singularity will happen. It's inevitable. Humanity will assimilate itself. Science is about to rock our world!