To the untrained eye, a circuit built with IBM’s online Quantum Experience tool looks like something out of an introductory computer-science course. Logic gates, the building blocks of computation, are arrayed on a digital canvas, transforming inputs into outputs.

But this is a quantum circuit, and the gates modify not the usual binary 1 or 0 bits, but qubits, the fundamental units of quantum computing. Unlike a binary bit, a qubit can exist in a ‘superposition’ of both 1 and 0, resolving one way or the other only when measured. Quantum computing also exploits properties such as entanglement, in which changing the state of one qubit also changes the state of another, even at a distance.
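Superposition and entanglement can be illustrated without any quantum hardware by tracking a two-qubit state vector in plain Python. The sketch below is illustrative only (the helper names and matrices are our own, not from any quantum library): a Hadamard gate puts the first qubit into superposition, then a CNOT gate entangles the pair, producing a state in which the qubits are always measured to agree.

```python
# A minimal sketch of a two-qubit circuit as state-vector arithmetic.
# The amplitudes are ordered |00>, |01>, |10>, |11>.
import math

def mat_vec(m, v):
    """Multiply a matrix (list of rows) by a state vector."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

s = 1 / math.sqrt(2)

# Hadamard on the first qubit of a two-qubit register (H tensor I).
H_I = [[s, 0, s, 0],
       [0, s, 0, s],
       [s, 0, -s, 0],
       [0, s, 0, -s]]

# CNOT: flip the second qubit when the first qubit is 1.
CNOT = [[1, 0, 0, 0],   # |00> -> |00>
        [0, 1, 0, 0],   # |01> -> |01>
        [0, 0, 0, 1],   # |10> -> |11>
        [0, 0, 1, 0]]   # |11> -> |10>

state = [1, 0, 0, 0]            # both qubits start in |0>
state = mat_vec(H_I, state)     # superposition on the first qubit
state = mat_vec(CNOT, state)    # entangle: (|00> + |11>) / sqrt(2)

# Measurement probabilities: each qubit is random alone, but the
# pair is perfectly correlated (never 01 or 10).
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)   # -> [0.5, 0.0, 0.0, 0.5]
```

The final probabilities are the signature of entanglement: measuring either qubit yields 0 or 1 with equal chance, yet the two outcomes always match.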

Those properties empower quantum computers to solve certain classes of problem more quickly than classical computers. Chemists could, for instance, use quantum computers to speed up the identification of new catalysts through modelling.

Yet that prospect remains a distant one. Even today's most powerful quantum computers have no more than 100 qubits, and are plagued by random errors. In 2019, Google demonstrated that its 54-qubit quantum computer could solve in minutes a problem that would take a classical machine 10,000 years. But this ‘quantum advantage’ applied only to an extremely narrow situation. Peter Selinger, a mathematician and quantum-computing specialist at Dalhousie University in Halifax, Canada, estimates that computers will need several thousand qubits before they can usefully model chemical systems.

“The stage of quantum computers now is something like classical computing in the late 1980s,” says Sara Metwalli, a quantum-computing researcher at Keio University in Tokyo. “Most of the work done now is to prove that quantum, in the future, may have the ability to solve interesting problems.”