Scientists successfully manipulate qubits with electrical fields

By Darren Quick

December 26, 2010

Scanning electron image of the nanowire device with gate electrodes used to electrically control qubits, and source and drain electrodes used to probe qubit states


Until now, the common practice for manipulating the electron spin of quantum bits, or qubits – the building blocks of future super-fast quantum computers – has been to use magnetic fields. Unfortunately, such magnetic fields are extremely difficult to generate on a chip, but now Dutch scientists have found a way to manipulate qubits with electrical rather than magnetic fields. The work marks another important step in the quest for future quantum computers, which would far outstrip current computers in terms of speed.

Just like a normal computer bit, a qubit can adopt the states '0' and '1'. One way to make a qubit is to trap a single electron in semiconductor material. Its state can be set using the spin of the electron, an intrinsic property often pictured as the electron rotating on its axis. Because the spin can point in two opposite directions, one direction represents the '0' state, while the opposite direction represents the '1' state.
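As a rough illustration (not from the article itself), the two spin states and their combination can be sketched numerically. The vectors `ket0` and `ket1` below follow the standard textbook convention for qubit basis states and are not specific to the Delft device:

```python
import numpy as np

# A qubit's state is a complex superposition a|0> + b|1> with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)  # spin in one direction  -> '0'
ket1 = np.array([0, 1], dtype=complex)  # opposite spin direction -> '1'

# Unlike a classical bit, a qubit can also sit in both states at once.
plus = (ket0 + ket1) / np.sqrt(2)

# On measurement, the probabilities are the squared amplitudes.
p0, p1 = abs(plus[0]) ** 2, abs(plus[1]) ** 2
print(p0, p1)  # each is approximately 0.5
```

This superposition, rather than the 0/1 states themselves, is what gives quantum computers their potential speed advantage.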

Until now, the spin of an electron has been controlled with magnetic fields, but scientists from the Kavli Institute of Nanoscience at Delft University of Technology and Eindhoven University of Technology have now succeeded in controlling the electron spin in a qubit with an electric field instead.
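To make "controlling the spin" concrete, here is a minimal sketch, an illustration rather than the team's actual method: resonantly driving the spin, whether with a magnetic field or, as in this work, an electric field coupled in via spin-orbit interaction, amounts to rotating the qubit state. A rotation by pi flips '0' into '1':

```python
import numpy as np

def rx(theta):
    """Rotation of a spin-1/2 state by angle theta about the x-axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s],
                     [-1j * s, c]])

ket0 = np.array([1, 0], dtype=complex)   # start in the '0' state
flipped = rx(np.pi) @ ket0               # a pi rotation acts as a spin flip

# Probability of now finding the qubit in '1':
print(abs(flipped[1]) ** 2)  # approximately 1.0
```

Smaller rotation angles leave the qubit in a superposition, which is how single-qubit operations are built up in practice.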

According to Leo Kouwenhoven, a scientist at the Kavli Institute of Nanoscience at TU Delft, this form of control has major advantages. "These spin-orbit qubits combine the best of both worlds. They employ the advantages of both electronic control and information storage in the electron spin," he said.

In another important quantum computing development, the scientists have also been able to embed these qubits into semiconductor nanowires. They embedded two qubits in indium arsenide nanowires measuring just nanometers in diameter and micrometers in length.

"These nanowires are being increasingly used as convenient building blocks in nanoelectronics. Nanowires are an excellent platform for quantum information processing, among other applications," said Kouwenhoven.

The scientists’ findings appear in the current issue of the journal Nature.

About the Author
Darren Quick Darren's love of technology started in primary school with a Nintendo Game & Watch Donkey Kong (still functioning) and a Commodore VIC 20 computer (not still functioning). In high school he upgraded to a 286 PC, and he's been following Moore's law ever since. This love of technology continued through a number of university courses and crappy jobs until 2008, when his interests found a home at Gizmag.   All articles by Darren Quick
3 Comments

Is this going to be like cold fusion, where it works but no one really gives a damn? Seriously, we already have way more computing power than we need with our current technology. Short of real artificial technology and supercomputers, who really needs this?

From what I have read, no one thinks this is going to be a game changer in the next 20 years. Twenty years ago, I bet they predicted that laser discs would be the next big thing. Heck, 10 years ago I really thought MiniDiscs had a chance.

Yes, I understand the theory says it is possible. I just don't see it. I think it's a lot of really smart people doing research with other people's money. In the end they won't make a thing.

I liken this to someone seeing an ape that knows sign language and predicting that in the future we will have monkey butlers. What works in a lab most of the time doesn't matter to the world.

The limitation of our current technology is the power consumption of screens, radio transmitters and the batteries that run them. For very little money you could triple the processing power of an iPad and only show a slight decrease in battery life. I'm sorry, where is the money for new technology? Where is the need for more processing power?

I would love to be proven wrong, luckily this we

Michael Mantion
28th December, 2010 @ 01:45 am PST

@Michael Mantion

Everything you say is wrong.

Cold fusion never worked.

If it had worked, it would have been revolutionary.

"Artificial technology" is nonsense, technology is always artificial by definition.

Computing power is basic infrastructure. The demand is much higher than what's available.

Laser discs and MiniDiscs are formats, not a whole realm of technologies.

And by the way, improve your English, a lot of your sentences are just complete gibberish.

Sta2think
16th January, 2011 @ 10:48 am PST

Someone once said desktop computers would be a fad that wouldn't last, and that there would possibly be a call for a few hundred worldwide.

When I started working with an ICL 1900 mainframe in 1968/69, it had 16k of memory and used tapes almost as thick as my two fingers to store info. Within a year they had installed a hard drive which took, I think, a stack of four disks almost 600 cm across, and it would take quite an effort to carefully put it on the reader. Within three years it was upgraded to 32k, and the first payroll had been sent by phone line from a subsidiary part of the business.

Think how far we have come in 40 years. The computer I use now has more power than all the business computers in the British Isles had at that time. On the downside, I possess reel-to-reel tapes with info from that time and earlier that I can no longer access. I still have a "floppy" somewhere that is of absolutely no use whatsoever; I couldn't store even one image on it if I had the hardware to use it. So don't say we have no use for all the discoveries coming up.

Meryl Moscrop
18th January, 2011 @ 01:57 am PST