Rating: Summary: A Quantum Leap for Computing Review: Your computer will soon be out of date. You know that already, especially if you know about Moore's law, formulated forty years ago, which says that every year and a half the density of components on a computer chip will double. From the room-sized vacuum-tube monsters down to the sprightly laptop, there has been a continual decrease in size and increase in speed. But silicon technology cannot shrink forever; it is still built from atoms, and a component cannot get smaller than an atom. There is no law, however, that says we must forever depend on silicon, and so entirely new technologies may be developed. The technology, undeveloped but promising, that has most interested physicists and computer scientists is quantum computing. We don't have quantum computers yet, and they aren't a sure thing, but the possibilities are tantalizing. George Johnson, a science journalist, has tried to make the new technology plain in _A Shortcut Through Time: The Path to the Quantum Computer_ (Knopf), and for those of us who aren't mathematicians, physicists, or computer scientists, he has done an admirable job of making a very strange, not-yet-practical technology understandable. Few of us need to know how silicon chips work, and fewer still will ever understand how quantum computers will work. Indeed, the quantum world is so strange and counterintuitive that no one can really understand it. But Johnson's book is a good introduction to the strangeness, and a good vantage point from which to watch the upcoming revolution, if it comes.

Johnson's book is about a real quantum leap. The classical physics of our silicon computers does not hold within the tiny spaces inside atoms. Single particles at that scale can _really_ be in two places at once, and similarly, a quantum bit of information (known as a qubit) can be set to 1 and 0 at the same time, a state known as a "superposition."
Qubits could be harnessed to perform almost instantaneous calculations on enormous problems, and nothing in physics says such computing should be impossible. Indeed, on the smallest of scales, primitive quantum computing has already been accomplished. Qubits are temperamental, though, and current research has to be done at supercold temperatures, shielded from any disturbance. Still, there is enormous intellectual interest in the prospect of quantum computing. One researcher in the field said that he and his colleagues are "writing the software for a device that does not yet exist." If quantum computing works, for instance, we will have to rethink all our current encryption methods, which are based on the difficulty of factoring large numbers; quantum computers could do such things with an ease silicon never can match. You aren't going to understand quantum computers by reading this book; Johnson knows he is trying to describe the indescribable, and he makes it clear that he is no physicist, just someone trying to understand what all the fuss is about. His book is lucid, and his descriptions do not bog down in technicalities (at times he gleefully hurtles over them). The book is brief, but it has enough substance to give even those who know little about current computing some basic understanding of where quantum computers may take us. He has successfully conveyed the excitement these potential gadgets have sparked, and readers will be able to share in that excitement themselves.