Steve Jurvetson: AI, Nanotech and the Future of the Human Species
Visionary venture capitalist Steve Jurvetson quotes quantum-computing pioneer and Oxford professor David Deutsch, who wrote in his controversial masterpiece, The Fabric of Reality: "quantum computers can efficiently render every physically possible quantum environment, even when vast numbers of universes are interacting. Quantum computers can also efficiently solve certain mathematical problems, such as factorization, which are classically intractable, and can implement types of cryptography which are classically impossible. Quantum computation is a qualitatively new way of harnessing nature."
"Quantum computers," Jurvetson summarizes in a recent J-Curve blog post, "can perform accurate simulations of any physical system of comparable complexity. The type of simulation that a quantum computer does results in an exact prediction of how a system will behave in nature, for example an iterative system like a cellular automaton, something that is literally impossible for any traditional computer, no matter how powerful."
Quantum computing sounds like science fiction, as satellites, moon shots, and the original microprocessor once did. But the age of computing is not even at the end of the beginning.
Traditional computing, with its ever more microscopic circuitry etched in silicon, will soon reach a final barrier: Moore's law, which dictates that the amount of computing power you can squeeze into the same space doubles every 18 months, is on course to run smack into a silicon wall by 2015, thanks to overheating caused by electrical charges running through ever more tightly packed circuits.
To leapfrog the silicon wall, we have to figure out how to manipulate the brain-bending rules of the quantum realm - an Alice in Wonderland world of subatomic particles that can be in two places at once.
Where a classical computer obeys the well-understood laws of classical physics, a quantum computer is a device that harnesses physical phenomena unique to quantum mechanics (especially quantum interference) to realize a fundamentally new mode of information processing.
The fundamental unit of information in quantum computing, called a quantum bit or qubit, is not binary in nature, and its behavior differs radically from anything allowed by the laws of classical physics.
A qubit can exist not only in a state corresponding to the logical state 0 or 1, as in a classical bit, but also in states corresponding to a blend, or superposition, of these classical states. In other words, a qubit can exist as a zero, a one, or simultaneously as both 0 and 1, with a numerical coefficient representing the probability of measuring each state. This may seem counterintuitive because everyday phenomena are governed by classical Newtonian physics, not quantum mechanics, which takes over at the atomic scale.
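The superposition idea can be made concrete in a few lines of ordinary code. The sketch below is purely illustrative (the names `normalize` and `measure` are ours, not from any quantum library): a qubit is modeled as a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import math
import random

def normalize(a, b):
    """Scale amplitudes so the measurement probabilities sum to 1."""
    n = math.sqrt(abs(a) ** 2 + abs(b) ** 2)
    return a / n, b / n

def measure(a, b):
    """Collapse the superposition: 0 with probability |a|^2, else 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# An equal superposition of 0 and 1 (the "both at once" state):
a, b = normalize(1, 1)

counts = [0, 0]
for _ in range(10_000):
    counts[measure(a, b)] += 1
# Repeated measurements yield 0 and 1 each roughly half of the time.
```

Note what the sketch captures and what it misses: the coefficients behave like probabilities on measurement, but a classical simulation like this cannot reproduce the interference between amplitudes that gives a real quantum computer its power.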
The reason this is exciting is the massive quantum parallelism achieved through superposition: a single operation on such a machine would be the equivalent of performing the same operation on a classical supercomputer with ~10^150 separate processors, which is impossible.
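Where that astronomical figure comes from can be seen with a quick back-of-the-envelope sketch (our own illustration, not a quantum program): describing an n-qubit register requires 2^n complex amplitudes, and one gate acts on all of them at once.

```python
# An n-qubit register in equal superposition needs 2**n amplitudes.
n = 10
amplitudes = [1 / (2 ** (n / 2))] * (2 ** n)

print(len(amplitudes))  # 1024 amplitudes for just 10 qubits

# The probabilities (squared amplitudes) still sum to 1:
total = sum(a * a for a in amplitudes)

# At 500 qubits the amplitude count is 2**500, roughly 3 x 10**150,
# which is the scale behind the "~10^150 processors" comparison.
```

Doubling the register from 10 to 11 qubits doubles the amplitude count, which is why the classical cost of tracking a quantum state explodes exponentially.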
The idea of a computational device based on quantum mechanics was first explored in the 1970s and early 1980s by physicists and computer scientists pondering the fundamental limits of computation, among them Charles H. Bennett of the IBM Thomas J. Watson Research Center, Paul A. Benioff of Argonne National Laboratory in Illinois, David Deutsch of Oxford, and the late Richard P. Feynman, Nobel laureate, of the California Institute of Technology.
They understood that if technology continued to abide by Moore's Law, then the continually shrinking size of circuitry packed onto silicon chips would eventually reach a point where individual elements would be no larger than a few atoms. Here a problem arose because at the atomic scale the physical laws that govern the behavior and properties of the circuit are inherently quantum mechanical in nature, not classical.
This then raised the question of whether a new kind of computer could be devised based on the principles of quantum physics.
Feynman was among the first to attempt to provide an answer to this question by producing an abstract model in 1982 that showed how a quantum system could be used to do computations. He also explained how such a machine would be able to act as a simulator for quantum physics. In other words, a physicist would have the ability to carry out experiments in quantum physics inside a quantum mechanical computer.
In 1985, Deutsch realized that Feynman's assertion could eventually lead to a general purpose quantum computer and published a crucial theoretical paper showing that any physical process, in principle, could be modeled perfectly by a quantum computer. Thus, a quantum computer would have capabilities far beyond those of any traditional classical computer. After Deutsch published this paper, the search began to find interesting applications for such a machine.
The breakthrough occurred in 1994, when Peter Shor circulated a preprint of a paper in which he set out a method for using quantum computers to crack an important problem in number theory, namely factorization. He showed how an ensemble of mathematical operations, designed specifically for a quantum computer, could be organized to enable such a machine to factor huge numbers extremely rapidly, much faster than is possible on conventional computers.
With Shor's breakthrough, quantum computing was transformed from a mere academic curiosity into a topic of national and international interest.
Quantum hardware, on the other hand, remains an emerging field, but the work done thus far suggests that it is only a matter of time before we have devices large enough to test Shor's and other quantum algorithms.
Beyond the actual creation of a quantum computer, our chief limitations are the imaginations of software engineers. This will be the major challenge of the Google whiz kids of tomorrow: to take computing and networking power that is effectively infinite and create interfaces that are simple enough for mere humans to understand.
Recent breakthroughs pioneered by Stuart Wolf of the University of Virginia allow us to take electricity out of the equation and get rid of the overheating problem that is undercutting Moore's law. Single electrons have been made to adjust their spin. Subatomic circuitry is within our grasp.
Freescale Semiconductor, a Motorola spinoff, recently began commercial shipments of magnetic random-access memory (MRAM) chips. With the giant magnetoresistive effect, or GMR, electrons spin like a top or a billiard ball in some direction relative to a magnetic field. Flip the direction of the field, and the electron flips the direction of its spin. This very basic quantum effect can be used like a binary bit, its direction labeled "0" or "1" and employed to store digital information.
MRAM is the physics inside your digital camera that lets it store a picture with no perceptible delay. Within a matter of years, your new laptop will switch on like a light.
The ability to control spin in a computational device, known as "spintronics," has huge implications: not just an end to overheating worries but the possibility of moving computer technology into the molecular realm. With molecular-level chips, a laptop could have more computing power than trillions of today's supercomputers.
Harnessing the molecular-level computing power of this exponential growth means you can tackle any problem that gets exponentially larger, and there are lots of important ones. We can't reliably predict weather or traffic or the mutation of viruses today because the number of variables and possible interactions is too massive for today's computers.
Qubits would change that and usher in a breathtaking new world of infinite possibilities: from computer ubiquity, with processors painted onto walls, embedded in chairs and in your body, communicating with one another constantly and requiring no more power than they can glean from radio frequencies in the air; to human-brain-imitating neural networks and true (or near-true) artificial intelligence; to ultrasonic technology that beams video games into brains linked to a global network with infinite bandwidth, so that any sense can be stimulated in any way; to "network-enabled telepathy" in place of cellphone conversations.
But real, useful quantum computers, for all their interest and potential, have proved fiendishly difficult to build. Until recently, quantum computers have been more-or-less successful lab experiments.
This past February, D-Wave Systems Inc., which claims to be the world's first and only provider of quantum computing systems designed to run commercial applications, ran an initial demonstration of its Orion quantum computing system, which is built around a 16-qubit superconducting adiabatic quantum processor. However, since D-Wave Systems has not released the full details of Orion to the scientific community, many experts are skeptical of its claims.
Posted by Casey Kazan.