RezQu is a family of devices and an architecture for a scalable quantum computer based on superconducting phase qubits. RezQu is being developed by a team at the University of California, Santa Barbara, led by John Martinis and Andrew Cleland. The team described their work at the American Physical Society meeting held in March 2011.
The 6cm-by-6cm chip holds nine quantum devices, among them four “quantum bits” that do the calculations. The team said further scaling up to 10 qubits should be possible this year. The team’s key innovation was finding a way to completely disconnect – or “decouple” – interactions between the elements of their quantum circuit: the delicate quantum states they create must be manipulated, moved, and stored without being destroyed. “It’s a problem I’ve been thinking about for three or four years now, how to turn off the interactions,” said John Martinis. “Now we’ve solved it, and that’s great – but there’s many other things we have to do.”
Rather than the ones and zeroes of digital computing, quantum computers deal in what are known as superpositions – states of matter that can be thought of as both one and zero at once. In a sense, quantum computing’s central trick is to perform calculations on all superposition states at once. With one quantum bit, or qubit, the difference is not great, but the effect scales rapidly as the number of qubits rises. The figure often touted as the number of qubits that would bring quantum computing into a competitive regime is about 100, so each jump in the race is a significant one.
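As a rough illustration of why the effect scales so rapidly (a sketch not taken from the article): an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the size of the state space a quantum computer can work over. The function name below is purely illustrative.

```python
# Illustrative only: the number of basis states (amplitudes) needed to
# describe an n-qubit register grows as 2**n, which is why going from
# a handful of qubits toward ~100 is such a significant jump.

def state_space_size(n_qubits: int) -> int:
    """Return the number of basis states of an n-qubit register."""
    return 2 ** n_qubits

for n in (1, 4, 10, 100):
    print(f"{n:>3} qubits -> {state_space_size(n)} basis states")
```

At 4 qubits there are only 16 basis states, but at the often-cited figure of roughly 100 qubits the state space (2^100, about 1.3 × 10^30) is far beyond what any classical machine could enumerate.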
The RezQu architecture is essentially a blueprint for a quantum computer, and several presentations at the American Physical Society conference focused on how to make use of it. RezQu appears to have an edge in one crucial arena – scalability – that makes it a good candidate for the far more complex circuits that would constitute a full-scale quantum computer. A key metric in quantum computing is how long the delicate quantum states can be preserved, and Britton Plourde, a quantum computing researcher from Syracuse University, noted that this time had increased a thousandfold since the field’s inception.
Source: original article