Echoing "Deep Thought," the computer in Douglas Adams's The Hitchhiker's Guide to the Galaxy designed by hyper-intelligent pan-dimensional beings to answer the question of Life, the Universe, and Everything, Hod Lipson, director of Cornell University's Computational Synthesis Lab, has developed an intelligent machine to "uncover the fundamental laws of nature". The machine can derive laws of physics, such as gravitation, by processing raw experimental data.
Burning the midnight oil one evening, graduate student Michael Schmidt, a specialist in computational biology, noticed an oddly familiar equation pop up on his monitor. His computer was crunching data from an experiment measuring the chaotic motion of a double pendulum, a contraption in which one swinging arm hangs from another, he told New Scientist in an interview.
Schmidt recorded this movement using a motion-tracking camera, which fed numbers into his computer. What he was looking for was an equation describing the motion of the pendulums.
"Initially, the task looked hopeless," he continued. "When a double pendulum moves chaotically, its arms swing in a way that is almost impossible to predict, with seemingly no pattern whatsoever. For a human, finding an equation for this would be almost impossible." And yet the computer found something. To Schmidt, a PhD student in computer science at Cornell University in Ithaca, New York, it was a hugely significant moment.
"It's probably the most exciting thing that has happened to me in science," he told New Scientist.
Schmidt's evolutionary computer had found one of the immutable laws of nature: the law of conservation of energy, which says that the total energy of an isolated system can be neither created nor destroyed, only converted from one form to another. What had taken scientists hundreds of years to discover took his computer just one day.
Their process begins by taking the derivative of every observed variable with respect to every other -- a mathematical way of measuring how one quantity changes as another changes. The computer then creates equations at random from the constants and variables in the data. It tests these against the known derivatives, keeps the equations that come closest to predicting correctly, modifies them at random and tests again, repeating until it evolves a set of equations that accurately describe the behavior of the real system.
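The generate-test-mutate loop described above can be sketched in a toy form. The code below is my own minimal illustration, not the researchers' actual software: it uses simulated data from a simple harmonic oscillator (variables x and v are assumptions for the example), and it scores candidate expressions by how nearly constant they stay across the data -- a simplification of the derivative-matching test described in the article -- while rejecting degenerate always-zero expressions.

```python
import math
import random

# Toy data: a unit harmonic oscillator, x(t) = cos(t), v(t) = -sin(t).
# The true invariant hiding in this data is x^2 + v^2.
data = [(math.cos(t), -math.sin(t))
        for t in (i * 0.1 for i in range(200))]

OPS = ['+', '-', '*']

def random_expr(depth=2):
    """Build a random expression tree over the variables x and v."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(['x', 'v'])
    return (random.choice(OPS), random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, x, v):
    """Recursively evaluate an expression tree at one data point."""
    if expr == 'x':
        return x
    if expr == 'v':
        return v
    op, a, b = expr
    a, b = evaluate(a, x, v), evaluate(b, x, v)
    return a + b if op == '+' else a - b if op == '-' else a * b

def fitness(expr):
    """Lower is better: an invariant should stay constant across the
    trajectory, but trivially-zero expressions (e.g. x - x) are rejected."""
    vals = [evaluate(expr, x, v) for x, v in data]
    mean = sum(vals) / len(vals)
    variance = sum((u - mean) ** 2 for u in vals) / len(vals)
    if abs(mean) < 1e-9 and max(vals) - min(vals) < 1e-9:
        return float('inf')  # degenerate: constant only because it is zero
    return variance

def mutate(expr):
    """Replace a random subtree with a fresh random one."""
    if random.random() < 0.5 or not isinstance(expr, tuple):
        return random_expr()
    op, a, b = expr
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

random.seed(0)
population = [random_expr() for _ in range(200)]
for _ in range(50):
    population.sort(key=fitness)        # keep the most invariant-like
    survivors = population[:50]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(150)]

best = min(population, key=fitness)
print(best, fitness(best))
```

With luck the search converges on a tree equivalent to x*x + v*v, the oscillator's energy up to a constant factor; as in the article, a human still has to recognize the printed tree as an energy law.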
Technically, the computer does not output equations but finds "invariants" -- mathematical expressions that remain true all the time -- from which humans can then derive equations.
"Even though it looks like it's changing erratically, there is always something deeper there that is always constant," Lipson explained. "That's the hint to the underlying physics. You want something that doesn't change, but the relationship between the variables in it changes in a way that's similar to what we see in the real system." Lipson's work focuses on evolutionary robotics, artificial life, and creating machines that can demonstrate aspects of human creativity.
Once the invariants are found, potentially all equations describing the system are available: "All equations regarding a system must fit into and satisfy the invariants," Schmidt said. "But of course we still need a human interpreter to take this step."
The researchers tested the method with apparatus used in freshman physics courses: a spring-loaded linear oscillator, a single pendulum and a double pendulum. Given data on position and velocity over time, the computer found energy laws, and for the pendulum, the law of conservation of momentum. Given acceleration, it produced Newton's second law of motion.
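The energy law recovered for the spring-loaded oscillator can be checked numerically. The sketch below is my own illustration, not the researchers' apparatus: it simulates a unit-mass spring with semi-implicit Euler integration and verifies that the quantity the search would recover, kinetic plus potential energy, stays nearly constant even as position and velocity swing back and forth.

```python
# Semi-implicit (symplectic) Euler for a unit-mass spring: x'' = -k*x.
# Parameter values here are arbitrary choices for the demonstration.
k, dt = 4.0, 0.001
x, v = 1.0, 0.0

def energy(x, v, k, m=1.0):
    """The invariant: kinetic energy plus spring potential energy."""
    return 0.5 * m * v * v + 0.5 * k * x * x

e0 = energy(x, v, k)
energies = []
for _ in range(10000):          # simulate 10 seconds
    v += -k * x * dt            # update velocity from the spring force
    x += v * dt                 # then position from the new velocity
    energies.append(energy(x, v, k))

drift = max(abs(e - e0) for e in energies) / e0
print(f"relative energy drift over 10 s: {drift:.2e}")
```

The drift stays below one percent even though x and v individually vary over their full range, which is exactly the "something that doesn't change" Lipson describes.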
The researchers point out that the computer evolves these laws without any prior knowledge of physics, kinematics or geometry. But evolution takes time. On a parallel computer with 32 processors, simple linear motion could be analyzed in a few minutes, but the complex double pendulum required 30 to 40 hours of computation. The researchers found that seeding the complex pendulum problem with terms from equations for the simple pendulum cut processing time to seven or eight hours. This "bootstrapping," they said, is similar to the way human scientists build on previous work.
Computers will not make scientists obsolete, the researchers conclude. Rather, they said, the computer can take over the grunt work, helping scientists focus quickly on the interesting phenomena and interpret their meaning.
The next great leap forward will be the evolution of quantum computers that have the potential to solve problems that would take a classical computer longer than the age of the universe.
Oxford professor David Deutsch, a quantum-computing pioneer, wrote in his controversial masterpiece The Fabric of Reality: "quantum computers can efficiently render every physically possible quantum environment, even when vast numbers of universes are interacting. Quantum computers can also efficiently solve certain mathematical problems, such as factorization, which are classically intractable, and can implement types of cryptography which are classically impossible. Quantum computation is a qualitatively new way of harnessing nature."
Quantum computing sounds like science fiction -- as satellites, moon shots, and the original microprocessor once were. But the age of computing is not even at the end of the beginning.
Traditional computing, with its ever more microscopic circuitry etched in silicon, will soon reach a final barrier. Moore's law, the observation that the amount of computing power you can squeeze into the same space doubles every 18 months, is on course to run smack into a silicon wall by 2015, largely because of the heat generated by electrical charges running through ever more tightly packed circuits.
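An 18-month doubling compounds quickly; the following is just the arithmetic implied by that figure, not a claim about any particular chip:

```python
def moore_factor(years, doubling_period_years=1.5):
    """Growth factor implied by a doubling every 18 months
    (the figure used in this article)."""
    return 2 ** (years / doubling_period_years)

# Two doublings in three years give a factor of four;
# a decade gives more than a hundredfold increase.
print(moore_factor(3))
print(moore_factor(10))
```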
To leapfrog the silicon wall, we have to figure out how to manipulate the brain-bending rules of the quantum realm -- an Alice in Wonderland world of subatomic particles that can be in two places at once.
Where a classical computer obeys the well-understood laws of classical physics, a quantum computer is a device that harnesses physical phenomena unique to quantum mechanics (especially quantum interference) to realize a fundamentally new mode of information processing.