Cosmos: Where are we at with quantum computing?
26 September 2023 by Diraq

This article was originally published by Cosmos as "Where are we at with quantum computing?".

Aberdeen, Maryland in the late 1940s was an exciting place to be. The town had a computer so powerful and so energy-intensive that rumour had it the lights in Philadelphia dimmed when it was switched on.

The computer – called the ENIAC – took up an area almost the size of a tennis court. It needed 18,000 vacuum tubes and had cords thicker than fists crisscrossing the room connecting one section to another.

Despite its size, the ENIAC is far less impressive by today's standards: its computing power would be dwarfed by a desk calculator.

Professor Tom Stace, the Deputy Director of the ARC Centre of Excellence in Engineered Quantum Systems (EQUS), believes that quantum computers are best thought of not as the computers we know today, but as big, lumbering systems like the ENIAC.

“ENIAC was the first digital computer,” said Stace.

“You see engineers programming, but that meant literally unplugging cables and plugging them into these gigantic room-size things. That’s sort of what a quantum computer looks like now. It’s literally bolts and cables that people have to wire up and solder together.”

To understand where we’re at with quantum computing, you first have to understand its potential.

The Quantum Hype

Computers are simply devices that can store and process data. Even the earliest computers used ‘bits’: basic units of information that are either on or off.

Quantum computers also store and process information, but instead of bits they use quantum bits, or ‘qubits’, which aren’t restricted to on or off – they can occupy any state in between.
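
That “anywhere in between” idea can be illustrated with a toy simulation (a sketch of the statistics only, not real quantum hardware – the function names here are invented for this example). A qubit is modelled as a pair of amplitudes whose squared magnitudes give the odds of reading 0 or 1:

```python
import math
import random

# Toy model of a single qubit (illustrative only): the state is a pair of
# amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1. Reading the qubit
# gives 0 or 1 with those probabilities, so the state can sit anywhere
# "in between" on and off until it is measured.

def make_qubit(alpha, beta):
    """Normalise two amplitudes into a valid qubit state."""
    norm = math.hypot(abs(alpha), abs(beta))
    return (alpha / norm, beta / norm)

def measure(state, rng):
    """Collapse the state to a classical 0 or 1."""
    return 0 if rng.random() < abs(state[0]) ** 2 else 1

rng = random.Random(0)
halfway = make_qubit(1, 1)  # equal parts "on" and "off"
counts = [0, 0]
for _ in range(1000):
    counts[measure(halfway, rng)] += 1
# counts comes out roughly [500, 500]: every individual reading is a
# definite 0 or 1, but the state itself was neither before measurement
```

A classical bit simulated this way could only ever produce all-zeros or all-ones; the even split is what “pointing in between” buys you.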

These qubits are the key to quantum computers’ huge potential – and also to their problems.

Groups like IBM and Google have spent millions of dollars creating quantum computers, no doubt buoyed by the riches awaiting whichever company gets there first.

Their efforts so far have been relatively lacklustre.

The machines are clunky: each wire and qubit needs to be individually placed or set up manually, and the whole thing has to sit inside a refrigerator cooled to near absolute zero.

Despite all these safeguards, the machines still make enough errors that it’s almost impossible to tell whether they have worked, or whether these million-dollar systems are just producing random noise.

And even that is impressive to scientists like Stace.

“Twenty years ago, if you had one qubit you got a Nature paper. Fifteen years ago, two or three qubits got you a Nature paper. Ten years ago, five qubits got you a Nature paper. Now, 70 qubits might get you a Nature paper,” says Stace.

“That’s telling you what the frontier looks like.”

Those on the frontier are aiming for supremacy – “quantum supremacy” to be exact.

Quantum supremacy is the term for a quantum computer solving a problem that no classical computer could solve in a reasonable time frame. It’s important to note, though, that the problem doesn’t have to be useful. There has been debate in quantum circles about how useful and practical these sorts of problems, or simulations, actually are as evidence that quantum computers are better.

Google’s machine – called the Sycamore processor – currently has 70 qubits all lined up and connected. In 2019, researchers claimed it had reached ‘quantum supremacy’. More recently, they were more specific, suggesting that a top-level supercomputer would take 47 years to do calculations that Sycamore managed in seconds.

IBM says its 433-qubit quantum computer, called Osprey, could soon have real-world applications. However, while IBM is ahead on qubit count, it is still struggling with the same error problems as other quantum systems.

To get a quantum computer that could rival supercomputers at actual tasks, you need hundreds of thousands, or even millions, of qubits rather than a few hundred. But the more qubits you have, the more errors creep into the system.

“Quantum systems are typically single atoms or single particles of light. Naturally, these are very fragile and very prone to disturbance or noise,” says UNSW quantum researcher and entrepreneur Professor Andrew Dzurak.

“That noise causes errors in the qubit information.”

Heat causes errors; vibration causes errors. Even simply looking at, or measuring, a qubit collapses its state altogether.

Both Dzurak and Stace stress the importance of correcting these errors. Without error correction, you have a very expensive, fragile machine that can’t tell you anything accurately.

The Final Frontier

How to fix these errors isn’t yet certain. While IBM, Google and other big companies are using superconducting qubits, smaller groups around the world are using everything from silicon to imperfections in diamond.

Dzurak has founded a start-up called Diraq, which aims to use traditional computer-chip technology to host the qubits, allowing easier design and the ability to pack millions of qubits onto one chip.

“We have a mountain to climb, and you have to go through the stages to get up that mountain,” he says.

“The work that is being done by [IBM and Google], often in collaboration with university groups, is important research and is moving the field forward.”

Entanglement is another important aspect of quantum computers, and one that makes them far harder to build. A quirk of quantum mechanics is that particles can become intrinsically linked, no matter the distance between them. Measure one particle and you learn something about the other, even if it’s halfway across the Universe. This is entanglement, and the more particles you can entangle, the more powerful your quantum computer can be.
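
The correlation can be seen in a toy simulation (again a sketch of the statistics only, not a real quantum simulator; the names are invented for this example). In the simplest entangled ‘Bell’ state, the only possible joint outcomes are 00 and 11, so the two measurements always agree:

```python
import random

# Toy statistics of the Bell state (|00> + |11>)/sqrt(2): the pair has
# amplitude only on the joint outcomes 00 and 11, each with probability 1/2.
# Neither qubit can be described on its own - that is entanglement.

def measure_bell_pair(rng):
    """Sample one joint measurement of the entangled pair."""
    return rng.choice([(0, 0), (1, 1)])

rng = random.Random(1)
results = [measure_bell_pair(rng) for _ in range(200)]

# Each qubit on its own looks like a random coin flip...
ones_on_first = sum(a for a, _ in results)
# ...but the two results always match, however far apart the qubits are.
always_match = all(a == b for a, b in results)
```

Either qubit alone gives a random-looking stream of 0s and 1s; only when you compare the two streams does the perfect correlation appear.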

But the more particles you entangle, the more complicated the system becomes, and the more likely it is to break down.

Here the history of computers seems to be repeating.

While ENIAC in Maryland was an undisputed success, it wasn’t the first design for a computer – not by a long shot. The first design – called the Difference Engine – came from the mathematician Charles Babbage in the 1820s.

But it wouldn’t be built in Babbage’s lifetime.

With only the technology available at the time, it was impossible to machine the metal parts precisely enough to build it. The project was doomed from the start.

It wasn’t until the invention of something seemingly unrelated – vacuum tubes, or valves – that ENIAC and other computers could begin to be built in earnest.

It’s a hard thing to admit, but when it comes to quantum computers, we don’t yet know whether we’re building the ENIAC or struggling with Babbage’s Difference Engine.

“It might be the case that the components that we’re pursuing now just aren’t precise enough, in the same way that the machining tools that they had in the 19th century weren’t precise enough to make a mechanical computer,” says Stace.

So where are we at with quantum computing? Not very far at all.

“It could be that we’re somewhere between Charles Babbage and the valve. We’ve got the idea, we know in principle we can make this thing. We just don’t know if we have the engineering chops to do it.”
