Canada Kicks Ass
Scientists discuss quantum computing.




GreatBriton @ Sun May 07, 2006 11:09 am

Computing is about to hit a problem: within 15 years or so, components will have shrunk to the scale of atoms, where the strange effects of quantum mechanics hold sway and no further size-reduction is possible.

So British scientists and engineers recently held a meeting at the Royal Society in London to discuss what comes next. The subject was "quantum computing." The Economist's report follows.




Quantum computing

One qubit at a time
May 4th 2006
From The Economist print edition

Four ways to build a quantum computer



COMPUTING is about to hit a problem. In each new generation the components are smaller than they were in its predecessor, and the speed at which this miniaturisation is happening means that within 15 years or so a fundamental limit will be reached. At that point, not only will the strange effects of quantum mechanics hold sway, the components themselves will be on the scale of atoms and no further size-reduction will be possible. Which is why scientists and engineers are seeking new ways of building computers.

One route they are exploring, which was discussed at a meeting held recently at the Royal Society in London, is called quantum computing. Instead of trying to overcome quantum weirdness, this technique embraces and exploits it. The thing that distinguishes a quantum computer from the sort in use today is the number of calculations it can do in parallel. Both sorts of computer use binary arithmetic, but they do so in rather different ways. A classical computer employs bits—binary digits, either zero or one—to process and store information. But a bit must be one or the other; it cannot be both at the same time. A quantum computer does not suffer from this restriction.

Quantum theory allows subatomic particles to exist in more than one state simultaneously, a phenomenon known as superposition. An electron, for example, has a property called spin that can be “up” or “down”—or a bizarre combination of the two. Using the spin of an electron to represent a bit of data would allow it to be both up and down (ie, zero and one) at the same time. Instead of being a bit it is, in the jargon, a qubit.
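
As a rough illustration (not from the article), a single qubit is usually simulated as a two-component complex vector; the squared sizes of the components give the odds of measuring "up" or "down". A minimal sketch in Python:

[code]
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a length-2 complex
# vector a|0> + b|1> with |a|^2 + |b|^2 = 1, so it can hold a blend of both.
up = np.array([1, 0], dtype=complex)     # "spin up"   -> logical 0
down = np.array([0, 1], dtype=complex)   # "spin down" -> logical 1

# An equal superposition: the spin is up and down at once until it is measured.
qubit = (up + down) / np.sqrt(2)

probabilities = np.abs(qubit) ** 2
print(probabilities)   # [0.5 0.5] -- a 50/50 chance of reading up or down
[/code]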

From bits to qubits
Unlike a bit, a qubit can—at least in theory, and if the program can be written in the correct way—be used in more than one calculation at a time. And if you add extra qubits, the process scales geometrically. A quantum computer with two qubits could run four calculations in parallel. A 20-qubit device could run over a million. A 1,000-qubit device could process more simultaneous calculations than there are particles in the observable universe.
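
The arithmetic behind those figures is just repeated doubling. A quick sketch (the particle count is a commonly quoted order-of-magnitude estimate, not a number from the article):

[code]
# Each extra qubit doubles the number of amplitudes a register holds, so an
# n-qubit machine tracks 2**n basis states at once.
particles_in_universe = 10**80   # rough order-of-magnitude estimate

for n in (2, 20, 1000):
    print(f"{n:>4} qubits -> {float(2**n):.3e} simultaneous basis states")

print(2**20 > 1_000_000)                # the 20-qubit claim: just over a million
print(2**1000 > particles_in_universe)  # the 1,000-qubit claim
[/code]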

It is this geometrical scaling that makes the potential of quantum computing so awesome. Besides disposing of the miniaturisation problem, quantum computing would, its proponents claim, allow the creation of computers that will massively out-perform existing machines. They will be able to do day-to-day calculations much faster and also solve problems that are currently intractable. They may even open up a new class of applications that can now be only guessed at, according to Lov Grover of Lucent Technologies, a researcher in the field.

If, that is, they can be made. So far, only small-scale devices have been demonstrated, and many of these need to be kept in strictly defined conditions.

One reason for this sensitivity is that qubits can maintain their quantum superposition only if they do not interact with other objects. They must thus be isolated from their surroundings. Andrew Briggs, a nanomaterials scientist at Oxford University, is a member of an international collaboration working on this problem. As he told the Royal Society meeting, his team recently managed to cage a nitrogen atom inside a buckyball (a sphere formed from 60 carbon atoms) and use its electrons as a single qubit.

The resulting molecule kept the qubit in a superposition for 500 nanoseconds—longer than any other molecular system studied. Unfortunately, this is still rather a short time (500 billionths of a second, to be precise), and is certainly not long enough to perform a calculation. To encourage the superposition to endure a little longer, the team repeatedly kicked the qubit with a pulse of microwaves, a technique known as “bang bang”. This disrupts any interaction between the qubit and its environment, and keeps the superposition in place a little longer.
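
A toy model gives a feel for why the kicks help. The sketch below assumes the simplest possible case -- each qubit sees a fixed, unknown environmental detuning and gets a single ideal flip halfway through -- which is far cruder than the real experiment, but it shows how refocusing preserves a superposition:

[code]
import numpy as np

rng = np.random.default_rng(0)

# An ensemble of qubits, each dephased by a different (static) environmental
# detuning; the superposition's phase drifts by detuning * time.
detunings = rng.normal(0.0, 1.0, size=10_000)   # arbitrary units
T = 4.0                                          # total free-evolution time

# No decoupling: the phases spread out and the averaged coherence washes away.
coherence_free = abs(np.mean(np.exp(1j * detunings * T)))

# One ideal "bang" at T/2 flips the qubit, so the phase picked up in the second
# half cancels the phase from the first half and the superposition survives.
phase_echo = detunings * (T / 2) - detunings * (T / 2)
coherence_echo = abs(np.mean(np.exp(1j * phase_echo)))

print(f"coherence without pulses: {coherence_free:.3f}")    # close to 0: lost
print(f"coherence with one flip:  {coherence_echo:.3f}")    # 1.0: preserved
[/code]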

So far, so good. And using buckyballs to isolate qubits could prove particularly useful, because atoms of elements other than nitrogen might also be caged this way. That opens up the possibility of finding more suitable materials for use as qubits, and of employing properties other than the spin of their electrons to create superposition. The work, nevertheless, remains at the single-qubit scale, many years from commercial use.

A second approach, being taken by a team that includes David Williams, who works at Hitachi's research laboratory in Cambridge, may be closer to usefulness. Dr Williams's team wants to employ existing silicon chips to make quantum computers. These have the advantage of being a proven technology. The idea is to manufacture “quantum dots”—tiny blobs of material that would act as qubits—on their surfaces. The dots would then be manipulated by larger-scale structures to create something resembling a real computer, rather than a laboratory curiosity. Indeed, all the elements for a semiconductor qubit have now been made in the lab, and some elements have even been made to work together. A complete circuit, however, has yet to be demonstrated.

A third way to create a quantum computer uses ions (electrically charged atoms) trapped by oscillating electric fields as its qubits. Many groups are working on this idea. Last year, for example, a team of researchers at the University of Michigan built a semiconductor chip that functioned as an ion trap. Such devices are constructed by standard lithographic techniques, which makes them attractive as a way to manufacture real quantum computers.

Condensed thinking
Clinging to tried and trusted methods, though, may not be the right approach. After all, it was not the pre-existing technology of valves that made the original digital computers viable, but a new—and initially temperamental—invention called the transistor. Developing existing technology for use in quantum computers might prove equally mistaken. In this context, a relatively newly discovered form of matter called a Bose-Einstein condensate may point the way ahead.

In a Bose-Einstein condensate, the atoms are so cold that they all fall into their lowest quantum state. That means they are, in quantum terms, identical. And that, in turn, means they act as a single quantum object—ideal for turning into a qubit, if the minor difficulty of keeping them at temperatures close to absolute zero in a functioning machine can be overcome.

Several groups are working on this problem, including a team led by Ed Hinds of Imperial College, London. Dr Hinds makes his condensates by zapping the atoms with specially tuned lasers. This forces them to emit energy and thus shed heat, eventually cooling them into a quantum uniformity that can be trapped by an electromagnetic field.

What happens next is not clear. Dr Hinds has not yet worked out how to talk to his newly created qubit. Nor are Bose-Einstein condensates necessarily the next transistors. But quantum computing does now seem to be acquiring a momentum of its own. Give it 15 years, and who knows what will result.

economist.com




gstang23 @ Sun May 07, 2006 12:11 pm

8O


I was having this same conversation with someone at work the other day.

(ok, I admit it. I'm lying.)

But honestly, I think most people have realized that there would have to be a total revamping of the way chips are designed. Instead of just going smaller and smaller on the transistors so they can put more of them on a chip, they need to rethink the way they work to be more efficient and with fewer roadblocks. Processors can do so many computations a second, but can't send the information because the bus system can't handle it all. Instead of making more routes (thus more parts and more chance of failure, as well as the heat factor), they need to make those routes able to handle the information as it comes and not have to cache it until it can be squeezed through.
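
A toy sketch of that bottleneck (all the rates and sizes below are made-up numbers, just to show the idea): when the core produces results faster than the bus can drain them, the cache fills up and the core spends most of its time waiting.

[code]
# Hypothetical figures: results produced per cycle, results the bus can move
# per cycle, and how many results the on-chip cache can hold.
compute_rate, bus_rate, cache_size = 8, 3, 64

cached = 0
stalled_cycles = 0
for cycle in range(100):
    cached += compute_rate              # the core keeps churning out results
    cached -= min(cached, bus_rate)     # the bus drains what it can
    if cached > cache_size:
        stalled_cycles += 1             # cache is full: the core has to wait
        cached = cache_size

print(f"cycles spent waiting on the bus: {stalled_cycles} / 100")
[/code]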



