
May 01, 2012

Is the Age of Silicon Computing Coming to an End? Physicist Michio Kaku Says "Yes"

 

[Image: the supermassive black hole at the center of the Milky Way]


Traditional computing, with its ever more microscopic circuitry etched in silicon, will soon reach a final barrier. Moore's Law, which holds that the amount of computing power you can squeeze into the same space doubles roughly every 18 months, is on course to run smack into a silicon wall: overheating, caused by electrical charges running through ever more tightly packed circuits.
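
To put that 18-month doubling in concrete terms, here is a quick back-of-envelope sketch in Python (the one-billion-transistor starting point is just an illustrative assumption, not a figure from Kaku or Intel):

```python
# Toy projection of Moore's Law: capability doubling every 18 months,
# using the figure quoted above.

def moores_law_projection(initial_count, years, doubling_period_years=1.5):
    """Project a transistor (or performance) count forward in time,
    doubling every `doubling_period_years`."""
    doublings = years / doubling_period_years
    return initial_count * 2 ** doublings

# Illustrative: a hypothetical 1-billion-transistor chip projected 10 years out.
print(f"{moores_law_projection(1e9, 10):.2e}")  # ~1.0e11, roughly a 100x increase
```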

"In about ten years or so, we will see the collapse of Moore’s Law. In fact, already, already we see a slowing down of Moore’s Law," says world-renowned physicist, Michio Kaku. "Computer power simply cannot maintain its rapid exponential rise using standard silicon technology."

According to Kaku, at the International Supercomputing Conference 2011 last June, Intel architecture group VP Kirk Skaugen acknowledged that Moore’s Law would not be sufficient, by itself, for the company to ramp up to exascale performance by 2018. He went on, however, to tout Intel’s tri-gate technology (the company’s so-called “3D” processors) as the solution, which Skaugen said means “no more end of life for Moore’s Law.”

Despite Intel’s recent advances with tri-gate processors, Kaku argues in a video interview with Big Think that the company has merely delayed the inevitable: the law’s collapse due to heat and leakage issues.

“So there is an ultimate limit set by the laws of thermodynamics and set by the laws of quantum mechanics as to how much computing power you can do with silicon,” says Kaku, noting, “That’s the reason why the age of silicon will eventually come to a close,” and arguing that Moore’s Law could “flatten out completely” by 2022.

Kaku sees several alternatives that could follow the demise of Moore's Law: protein computers, DNA computers, optical computers, quantum computers and molecular computers.

"If I were to put money on the table I would say that in the next ten years as Moore’s Law slows down, we will tweak it. We will tweak it with three-dimensional chips, maybe optical chips, tweak it with known technology pushing the limits, squeezing what we can. Sooner or later even three-dimensional chips, even parallel processing, will be exhausted and we’ll have to go to the post-silicon era,” says Kaku.

Kaku concludes that when Moore’s Law finally collapses by the end of the next decade, we’ll “simply tweak it a bit with chip-like computers in three dimensions. We may have to go to molecular computers and perhaps late in the 21st century quantum computers.”

We'll place our bets on quantum computing.

"Quantum computers can efficiently render every physically possible quantum environment, even when vast numbers of universes are interacting. Quantum computation is a qualitatively new way of harnessing nature,"  according to David Deutch, an Israeli-British physicist at the University of Oxford who pioneered the field of quantum computation and is a proponent of the many-worlds interpretation of quantum mechanics. Quantum computers, says Deutch, have the potential to solve problems that would take a classical computer longer than the age of the universe.

Astrophysicist Paul Davies at Arizona State University proposes that information, not mathematics, is the foundation on which physical reality and the laws of nature are constructed. Meanwhile, at MIT, computer scientist Seth Lloyd develops Davies's assumption by treating quantum events as "quantum bits," or qubits, the means by which the universe "registers itself."

Lloyd proposes that information is a quantifiable physical value, as much as mass or motion, and that any physical system--a river, you, the universe--is a quantum mechanical computer. He has calculated that "a computer made up of all the energy in the entire known universe (that is, within the visible “horizon” of forty-two billion light-years) can store about 10^92 bits of information and can perform 10^105 computations/second."

The universe itself is a quantum computer, Lloyd says, and it has performed a mind-boggling 10^122 computations since the Big Bang (for that part of the universe within the “horizon”).
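
As a sanity check on those exponents (the universe's age in seconds is the only extra assumption here), Lloyd's quoted rate does multiply out to roughly the total he cites:

```python
# Back-of-envelope check of Lloyd's figures: ~10^105 operations per second,
# sustained over the age of the universe, lands near his ~10^122 total.

SECONDS_PER_YEAR = 3.156e7            # ~365.25 days
AGE_OF_UNIVERSE_YEARS = 1.38e10       # roughly 13.8 billion years (assumed)
OPS_PER_SECOND = 1e105                # Lloyd's rate for the observable universe

age_seconds = AGE_OF_UNIVERSE_YEARS * SECONDS_PER_YEAR   # ~4.4e17 s
total_ops = OPS_PER_SECOND * age_seconds                 # ~4e122

print(f"age of the universe: {age_seconds:.1e} s")
print(f"operations since the Big Bang: {total_ops:.1e}")  # order 10^122
```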

In Year Million: Science at the Far Edge of Knowledge, leading and up-and-coming scientists and science writers cast their minds one million years into the future to imagine the fate of the human and/or extraterrestrial galaxy. The exercise, first attempted by H. G. Wells in his 1893 essay “The Man of the Year Million,” is an exploration of a barely conceivable distant future, in which the authors confront the possibilities facing future generations of Homo sapiens. How would the galaxy look if it were redesigned for optimal energy use and maximized intelligence? What would a universe bereft of stars be like?

Lloyd has proposed that a black hole could serve as a quantum computer and data-storage bank. Hawking radiation, he says, inadvertently carries information about the material inside the black hole as it escapes: matter falling in becomes entangled with the radiation leaving the hole's vicinity, and that radiation captures information on nearly all the matter that falls in.

“We might be able to figure out a way to essentially program the black hole by putting in the right collection of matter,” he suggests.

There is a supermassive black hole in the center of our galaxy, perhaps the remnant of an ancient quasar. Could the Milky Way's supermassive black hole (in image above) become the mainframe and central file sharing system for galaxy hackers of the Year Million? A swarm of ten thousand or more smaller black holes may be orbiting it. Might they be able to act as distributed computing nodes and a storage network? 

Toward the year 1,000,000 AD, an archival network spanning stars and galaxies could develop an Encyclopedia Universica, storing critical information about the universe at multiple redundant locations in those and many other black holes.

Quantum computing per MIT's Lloyd sounds like science fiction, as satellites, moon shots, and the original microprocessor once did. But the age of computing is not even at the end of the beginning.

To leapfrog the silicon wall, we have to figure out how to manipulate the brain-bending rules of the quantum realm, an Alice in Wonderland world of subatomic particles that can be in two places at once. Where a classical computer obeys the well-understood laws of classical physics, a quantum computer is a device that harnesses physical phenomena unique to quantum mechanics (especially quantum interference) to realize a fundamentally new mode of information processing.

The fundamental unit of information in quantum computing, called a quantum bit or qubit, is not binary in the classical sense. A qubit can exist not only in a state corresponding to the logical state 0 or 1, as in a classical bit, but also in states corresponding to a blend, or superposition, of these classical states.

In other words, a qubit can exist as a zero, a one, or simultaneously as both 0 and 1, with a numerical coefficient, or amplitude, whose squared magnitude gives the probability of each state. This may seem counter-intuitive because everyday phenomena are governed by classical Newtonian physics, not quantum mechanics, which takes over at the atomic level.
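
Here is a minimal sketch of that idea in plain NumPy (not any particular quantum-computing library): a qubit is just a pair of complex amplitudes, and the squared magnitudes give the odds of reading out 0 or 1.

```python
import numpy as np

# A single qubit as two complex amplitudes over the basis states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition: "both 0 and 1 at once" until measured.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)          # [0.5 0.5]
print(probs.sum())    # 1.0 -- probabilities always sum to one

# Simulating a measurement collapses the superposition to a definite 0 or 1.
outcome = np.random.choice([0, 1], p=probs)
print(outcome)
```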

The reason this is exciting is the massive quantum parallelism achieved through superposition: operating on a large register of qubits is equivalent to performing the same operation on a classical supercomputer with roughly 10^150 separate processors, which is impossible.
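
For scale, that ~10^150 figure corresponds to a register of only about 500 qubits, since n qubits are described by 2^n amplitudes at once; a two-line check:

```python
# An n-qubit register is described by 2^n complex amplitudes simultaneously.
n_qubits = 500
states = 2 ** n_qubits
print(f"2^{n_qubits} is about 10^{len(str(states)) - 1}")   # 2^500 ~ 10^150
```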

The idea of a computational device based on quantum mechanics was first explored in the 1970s and early 1980s by physicists and computer scientists such as Charles H. Bennett of the IBM Thomas J. Watson Research Center, Paul A. Benioff of Argonne National Laboratory in Illinois, David Deutsch of Oxford, and the late Richard P. Feynman, Nobel laureate at the California Institute of Technology, who were pondering the fundamental limits of computation.

They understood that if technology continued to abide by Moore's Law, then the continually shrinking size of circuitry packed onto silicon chips would eventually reach a point where individual elements would be no larger than a few atoms.  Here a problem arose because at the atomic scale the physical laws that govern the behavior and properties of the circuit are inherently quantum mechanical in nature, not classical. 

This then raised the question of whether a new kind of computer could be devised based on the principles of quantum physics.

Feynman was among the first to attempt to provide an answer to this question by producing an abstract model in 1982 that showed how a quantum system could be used to do computations.  He also explained how such a machine would be able to act as a simulator for quantum physics.  In other words, a physicist would have the ability to carry out experiments in quantum physics inside a quantum mechanical computer.

In 1985, Deutsch realized that Feynman's assertion could eventually lead to a general purpose quantum computer and published a crucial theoretical paper showing that any physical process, in principle, could be modeled perfectly by a quantum computer.  Thus, a quantum computer would have capabilities far beyond those of any traditional classical computer.  After Deutsch published this paper, the search began to find interesting applications for such a machine.

The breakthrough occurred in 1994, when Peter Shor of AT&T Bell Laboratories circulated a preprint of a paper in which he set out a method for using quantum computers to crack an important problem in number theory: factorization. He showed how an ensemble of mathematical operations, designed specifically for a quantum computer, could be organized to enable such a machine to factor huge numbers extremely rapidly, much faster than is possible on conventional computers.
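
The quantum speedup in Shor's algorithm lies entirely in finding the period of a^x mod N; the rest is classical number theory. The toy sketch below (with brute-force period finding standing in for the quantum step, so it only scales to tiny numbers) shows how a period yields the factors in the textbook case N = 15:

```python
from math import gcd

def find_period(a, N):
    """Brute-force the smallest r > 0 with a^r = 1 (mod N).
    This is the step a quantum computer performs exponentially faster."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_reduction(N, a):
    """Try to split N using the period of a modulo N (classical post-processing)."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g              # lucky guess: a already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        return None                   # odd period: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                   # trivial square root: retry with a different a
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if p * q == N and min(p, q) > 1 else None

print(shor_reduction(15, 7))          # (3, 5) -- the textbook example
```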

With Shor's breakthrough, quantum computing was transformed from a mere academic curiosity into a matter of national and international interest. Quantum hardware, on the other hand, remains an emerging field, but the work done thus far suggests that it is only a matter of time before we have devices large enough to test Shor's and other quantum algorithms.

Beyond the actual creation of a quantum computer, our chief limitations are the imaginations of software engineers. This will be the major challenge of the Google whiz kids of tomorrow: to take computing and networking power that is effectively infinite and create interfaces that are simple enough for mere humans to understand.

Recent breakthroughs pioneered by Stuart Wolf of the University of Virginia allow us to take electricity out of the equation and get rid of the overheating problem that is undercutting Moore's Law. Single electrons have been made to adjust their spin. Subatomic circuitry is within our grasp.

The Daily Galaxy via geek.com and http://techland.time.com

Additional sources:

Information and the Nature of Reality, Paul Davies and Niels Henrik Gregersen (2011)

http://www.qubit.org/people/david/David.html
http://blogs.zdnet.com/BTL/?p=6050

http://www.cs.caltech.edu/~westside/quantum-intro.html




Comments

My comments:

1) I think Dr. Kaku is being either generous on the survival of silicon computing, or pessimistic on the advent of quantum computing. The way things have been going, I suspect (going purely on instinctive sense here) that we'll start to have practical quantum computing by 2025, not late in the century as he postulates.

On the whole, though, I tend to agree.

2) Is TDG still having troubles with using the carat symbol, or did Dr. Lloyd really say that "a computer made up of all the energy in the entire known universe (that is, within the visible “horizon” of forty-two billion light-years) can store about 1092 bits of information and can perform 10105 computations/second." That seems like a pretty lame computer if it's the latter. Are you sure it's not 10^92 and 10^105?

Is there a good reason(s) why the discussion of quantum computing places a binary limit on qubit degeneracy? Is it that pen-and-pencil, or brain power, can only handle the relatively simple superposition calculations for binary qubits? Seems that, imho, once there is/are operational quantum computers then they can be used to program arrays of qubits whose positional degeneracy is larger than binary….e.g. some glass-like matrices of “frozen” transition metal ions, or even super cold sub-atomics and even relatively macro-systems of conjugated hetero-organics.

Personally speaking, wouldn’t we then come into a new computing realm whose characterizing “law”, instead of “Moore’s Law” which applies to storage, would be a tracking of the rate of information processing (how much information is processed per unit time) combined with storage - a kind of performance density metric? Relative to a quantum computer’s processing time, the storage basis would be an approximate constant, but the combined metric would be a truer indication of technical advancement.

This is a bit off topic, but I honestly don't think "we" as in humans will be talking about making better computers in one hundred years. We neglect the fact that the current silicon scale we "already have" is already more refined (nanometers) than the neural connections in the human brain (100 nm to 1 micrometer); it is just of lower capacity now simply because it is two dimensional rather than three. Cramming the components of even today's computers into a truly 3 dimensional space with simple parallelization is going to be an astronomical step forward, without even the need for quantum computing. Such a machine would have more complexity than the human brain of the same size and yet be able to think much much faster (compare 7 m/s conduction of neural axons to 300,000 m/s for electric circuits); it works out to something like 100 years of thinking in a single day for such a beast... the atomic bomb of the information age perhaps! That is a step forward which is incomprehensible for any of us already and we do not have the capacity to even understand its possibilities. What will be the meaning of the human race, when these machines can write better books and music and science than Shakespeare, Beethoven, or Einstein? So even tweaking today's technology, we are well on our way to creating something very superior to ourselves, where all questions of our advancement along these lines are going to be quite moot. I think all this talk of us humans going on to build quantum computers is nonsense. We will most certainly be obsolete or combined with machines by that point already, with today's technology just tweaked a bit more and scaled up.

why not quantumize the brain? "made for thinking"

I agree with Bob Greenwade's remarks that there are mis-quotes or missing symbols re Dr Lloyd's [somewhat fanciful and fantastical] estimations. In fact the article as a whole is all over the place, jumping from subject to subject willy-nilly.

Daily Galaxy is a nice blog but the articles should perhaps be chosen for their substance and the way in which the material is developed... for the credibility and believability of arguments contained therein.

Dr. Kaku has talked about a very feasible and practical use of Quantum Computing. Even though the idea isn't new to a part of the population, no serious action has been taken to employ this technology in totality.

TJ: I think only if they have sufficient AI capabilities and are truly capable of free thinking. For us to create that as humans would basically make us "god" to a new life form. I don't think machines will be making music and writing books anytime soon.

The manipulation of individual atoms in nano-arrays is well on its way. Surely a matter of less than a decade until this engineering provides for machines with immensely enriched or quantum capacity. This may imply a capacity for self-awareness if that's what we want. We fantasize about this, write novels and papers, make films and so on. We hope for, lust for and fear the consequences, populating our dreams and nightmares with terminators, mechanical man-fridays and cyber-erotic humanoids [or Nexus-6 pleasure models]. Personally, I think we're lonely and very possibly, a tad horny.

The promise of fusion-based energy systems, just 50 years ago, seems akin to current quantum computing predictions. Billions have been spent building woefully inefficient reactors on the "technology will advance" anthem. Basically, landing a man on the moon and retrieving him has addled our senses to believe tomorrow's possibilities outweigh man's desire for personal power, wealth and domination. The Romans didn't imagine that the Visigoths would eliminate their entire society in a year. We shall not predict beyond 20 years any better, regardless of technical proficiency. Technology never drives power.

Even if we are eventually going to smack into the limiting wall with silicon, I don't think we have realized the potential of what we have yet. Both Intel and HP have demonstrated 80 and 100 core CPUs, I think in the 45nm range. We are already producing only dual, quad, 8 and 16 core CPUs in 32nm and 28nm silicon. We have Flash memory hitting 19nm. I have heard the final limit on silicon because of electron tunneling is 5nm. You can get at least 25 times denser with 5nm silicon than we currently are with 28 and 32. 25X16 cores in one chip is nothing to sneeze at.

Now they are optically coupling layers together, demonstrating some of the first 3D chips with built-in DRAM. I think our hardware will far exceed our ability to program and make use of this silicon in just the next 10-20 years. We are already horribly inefficient in OS construction as it is. We have no idea what to do with 100+ cores when we have them.

We can also always make more silicon real estate too. Chips are actually getting cooler as distances are getting smaller since electrons don't have as far to go.

Imagine if we can put 256 cores in a future 15nm chip, then quadruple the die size, giving you 1024 cores, then build 16 layers with a TB of RAM added on for each layer? That's over 16,000 cores with coupled memory in a single chip!

Now make a Beowulf cluster of that! I think our real problem will be keeping up with software.

I think that optical computers will happen first after silicon, then quantum computers. Quantum computers are very far from being practical. We still haven't solved major quantum problems like synchronization. We are getting close to having optical computers, very close. By the time we exhaust the abilities of optical computers, consumer computers will be vastly more powerful than the applications that are being run. Once we max out optical computers we will be running Crysis on mobile phones.

POET. GaAs.
Could this be the solution?
http://poet-technologies.com/

Seems to put Silicon to shame.
This company looks close to bringing it to market.
Maybe a year off.

