Will quantum computers ever work?
Quantum Computers
See further reading for a list of books related to this subject.
Quantum computing is tipped to become a technology of major importance in the next few decades. But is it really going to happen, or is it all hype?
The idea of a quantum computer is that a quantum system can be in a superposition of many states. If the system is set to evolve in such a way that it performs a calculation, then that calculation is effectively carried out on each state of the superposition, so the system can do many calculations in parallel. This raises the possibility of computers far more powerful than anything built from ordinary processors.
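To make this concrete, here is a minimal sketch (my own illustration in Python with numpy, not something from the field's literature) of a toy statevector simulator: Hadamard gates put n qubits into a superposition of all 2^n basis states, and a single unitary then acts on every one of those states at once.

```python
import numpy as np

# Toy statevector simulator: n qubits live in a 2^n dimensional space.
n = 3
dim = 2 ** n

# Start in |000> and apply a Hadamard to every qubit, producing an
# equal superposition of all 2^n basis states.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)
state = Hn @ np.eye(dim)[0]          # amplitude 1/sqrt(2^n) everywhere

# A reversible "calculation": any permutation of basis states is a
# valid unitary. Here f(x) = (x + 1) mod 2^n, applied to all 2^n
# basis states with a single matrix multiplication.
U = np.zeros((dim, dim))
for x in range(dim):
    U[(x + 1) % dim, x] = 1.0
state = U @ state

# Every basis state has been acted on in parallel -- but a measurement
# would reveal only one outcome, which is why clever algorithms are
# needed to extract anything useful from the parallelism.
print(np.round(state, 3))
```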
Beginnings
The idea of a quantum computer didn't originate with computer designers trying to make faster computers. Rather, it came from the theoretical ideas of physicists. Richard Feynman saw that the complexity of quantum mechanical calculations grows exponentially with the number of particles, and reasoned that one way to do such calculations would be to use quantum systems themselves as the computing device. David Deutsch was looking for a way to demonstrate the Many-Worlds version of quantum theory, and saw that a device which carried out different parts of a calculation in different universes would be a possible candidate.
Early problems and their solutions
At first it was hard to see whether quantum computers would be of general use - they might not provide any speedup for the sorts of problems people actually wanted computers for. Then in 1994 Peter Shor devised a quantum algorithm for factoring numbers. Since the difficulty of factoring large numbers is crucial to computer security, the area began to attract a great deal of interest. A further problem then arose: because quantum computers depend on a single delicate quantum system, they can't be designed to be robust, so errors are bound to creep in. Since most calculations can be wrecked by a single error, this threatened to make quantum computers useless. The problem was solved by the development of quantum error-correcting codes.
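For the curious: the quantum speedup in Shor's algorithm lies entirely in finding the period r of a^x mod N; turning that period into factors is classical number theory. Here is a rough Python sketch (function names are mine), with a brute-force loop standing in for the quantum period-finding step:

```python
from math import gcd

def find_period(a, N):
    """Find the smallest r > 0 with a^r = 1 (mod N).
    This is the step Shor's algorithm speeds up on a quantum
    computer; here we just brute-force it classically."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Classical reduction: turn a period of a mod N into factors of N."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g            # lucky guess, no period needed
    r = find_period(a, N)
    if r % 2 == 1:
        return None                 # odd period: try another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                 # trivial square root: try another a
    p = gcd(y - 1, N)
    return p, N // p

print(shor_factor(15, 7))           # (3, 5) -- the 2001 experiment's feat
```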
15 = 5 × 3
The announcement of the 2001 result is at IBM's Test-Tube Quantum Computer Makes History. Interestingly, there doesn't seem to have been any advance on this in the subsequent three years.
The current position is that simple quantum computers have been implemented with a variety of technologies - cavity QED, ion traps, and NMR are the three main methods. These have been used to make quantum computers with a few qubits, but hundreds or thousands of qubits will be required for a quantum computer that does anything useful. In 2001 researchers managed to factor 15 using a quantum computer. Not very impressive, maybe, but at least it showed that the field is progressing, and that the more advanced uses of quantum computers might become possible.
Doubts
Some people, though, have their doubts about whether the field will progress as expected. Firstly, although error-correcting codes are able to deal with the errors introduced, this always comes at the cost of using a larger system than would otherwise be necessary. It is hard to be sure that a working system wouldn't require so much error-correcting capacity as to cancel out any computational benefit. Secondly, quantum computers require reality to match theory to very high accuracy. Although it is claimed that the theory of quantum mechanics has been shown to match experiment to whatever accuracy is required, this claim should be taken with a pinch of salt - it only applies to simple systems. The fact is that full quantum mechanical calculations simply become impossible to do for systems of any reasonable complexity.
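The overhead is easy to illustrate with the simplest case, the three-qubit bit-flip code: three physical qubits (plus syndrome machinery) are spent to protect a single logical qubit against just one kind of error. A toy numpy simulation of my own follows - note that on real hardware the syndrome comes from measuring ancilla qubits, not from peeking at amplitudes as this sketch does:

```python
import numpy as np

def apply_x(state, qubit, n=3):
    """Flip `qubit` (0 = leftmost) by permuting basis-state amplitudes."""
    out = np.zeros_like(state)
    for idx in range(len(state)):
        out[idx ^ (1 << (n - 1 - qubit))] = state[idx]
    return out

# Encode one logical qubit in three physical ones:
# a|0> + b|1>  ->  a|000> + b|111>.  The tripled qubit count (plus
# the syndrome extraction below) is the price of tolerating one flip.
a, b = 0.6, 0.8
state = np.zeros(8)
state[0b000], state[0b111] = a, b

state = apply_x(state, 1)          # noise flips the middle qubit

# Syndrome extraction: parities of qubit pairs (0,1) and (1,2). These
# parities are identical for every basis state with nonzero amplitude,
# so reading them reveals the error but nothing about a and b.
i = next(k for k in range(8) if abs(state[k]) > 1e-12)
q0, q1, q2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
s01, s12 = q0 ^ q1, q1 ^ q2

# Map syndrome -> which qubit (if any) to flip back.
flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s01, s12)]
if flip is not None:
    state = apply_x(state, flip)

print(state[0b000], state[0b111])  # 0.6 0.8 -- the logical state survives
```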
A single molecule?
The article can be found at Digital image stored in single molecule.
As an example of the hype in this area, a couple of years ago it was reported that 1024 bits could be stored in the 19 hydrogen atoms of a single molecule. Unbelievable! I thought - and I was right, it wasn't true. What was actually happening was that NMR was being used to store the information in a liquid crystal, which contains a huge number of molecules. The point was that the data wasn't split up into different places, and the sample could be made smaller and still store the same amount of data - but it would still be way more than a single molecule.
The extended Church-Turing thesis
If you're interested in protein folding then see Folding@Home.
The Church-Turing thesis is the idea that any one physical realisation of a computer can be simulated by any other. Quantum computers do not violate this. What they do violate is the Extended Church-Turing Thesis (ECTT), which says that any such simulation can be done with at most polynomial overhead. Although most accounts present quantum computing as a radical departure for a community which believed the ECTT, in a way the ECTT is more radical - it asserts that no problems based in reality are that hard to solve. If you believe it (and I'm beginning to) then there must be a polynomial-time algorithm for problems such as protein folding. Instead of using large networks of computers, maybe such problems could be solved with a single computer - with the right program.
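The stakes are easy to quantify: the obvious classical simulation of an n-qubit system has to track 2^n complex amplitudes, an exponential overhead - exactly the sort of thing the ECTT says a physical system shouldn't be able to force on its simulators. A quick back-of-the-envelope calculation:

```python
# Memory needed to store a full n-qubit statevector as complex128
# values (16 bytes per amplitude). This exponential blow-up is what
# would have to be avoided for the ECTT to survive quantum mechanics.
for n in (10, 30, 50, 70):
    amplitudes = 2 ** n
    print(f"{n:3d} qubits: 2^{n} amplitudes = {amplitudes * 16:.3e} bytes")
```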
My Ideas
When I first heard the name 'quantum computers', I thought, yes, microelectronics will continue getting smaller and smaller until it reaches the quantum level. However, that's not what quantum computing is about. In fact quantum computers involve large items of apparatus to access the information, and some versions such as NMR computation use macroscopic sample sizes. This set me thinking. What if, I thought, the extra computing power wasn't concentrated in a few atoms, but was distributed throughout the apparatus? We all know that quantum theory is about spooky links in systems. This would change the interpretation from one of doing calculations in parallel universes to one of doing calculations in parallel atoms - a lot less weird.
Then quantum and classical computers would face the same sort of limits. Note that this doesn't say that the quantum computing approach is useless, rather that it could be looked at differently - giving access to N = 2^70 processors without having to access each one individually. In fact, since the ECTT assumes polynomial rather than linear equivalence, we might get the equivalent of N^2 processors or more - not to be sneezed at.
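For scale (my own back-of-the-envelope arithmetic, not a figure from the field): 2^70 is within a couple of orders of magnitude of the number of molecules in a gram of water, which is what makes the "parallel atoms" reading numerically plausible for an NMR-style macroscopic sample.

```python
# Rough scale check (illustrative numbers only): a ~1 mL water sample
# holds about N_A / 18 molecules, comparable to 2^70.
N_A = 6.022e23                 # Avogadro's number
molecules = N_A / 18           # molecules per gram of water (~1 mL)
print(f"molecules ~ {molecules:.2e}, 2^70 ~ {2**70:.2e}")
```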