IBM claims it has produced a 127-qubit quantum computer. This is over double the size of comparable machines made by Google and the University of Science and Technology of China
15 November 2021
IBM has created the world’s largest superconducting quantum computer, surpassing the size of state-of-the-art machines from Google and from Chinese researchers. Previous devices have demonstrated up to 60 superconducting qubits, or quantum bits, working together to solve problems, but IBM’s new Eagle processor more than doubles that by stringing together 127.
Several approaches to building a practical quantum computer are being pursued by teams around the world, including superconducting circuits and entangled photons, and it remains unclear which will become the equivalent of the transistor that powered the classical computing revolution.
In 2019, Google announced that its Sycamore processor, which uses the same superconducting architecture that IBM is working with, had achieved quantum supremacy – the name given to the point at which quantum computers can solve a problem that a classical computer would find impossible. That processor used 54 qubits, but has since been surpassed by 56- and then 60-qubit demonstrations with the Zuchongzhi superconducting processor from the University of Science and Technology of China (USTC) in Hefei.
IBM’s 127-qubit Eagle processor now takes the top spot as the largest, and therefore theoretically most powerful, superconducting quantum computer to be demonstrated. Each additional qubit represents a significant step forward in ability: unlike classical computers, which rise in power in a linear fashion as they grow, one additional qubit effectively doubles a quantum processor’s potential power.
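The doubling described above follows from the fact that describing the state of an n-qubit register requires 2ⁿ complex amplitudes. A minimal sketch (an illustration, not anything published by IBM) shows why each extra qubit doubles the resources a classical simulator would need:

```python
def state_vector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits: 2**n."""
    return 2 ** n_qubits


def classical_memory_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory a classical simulator needs to hold the full state vector,
    assuming 16 bytes per complex amplitude (complex128)."""
    return state_vector_size(n_qubits) * bytes_per_amplitude


# Each additional qubit doubles the state space...
assert state_vector_size(61) == 2 * state_vector_size(60)

# ...so a full state vector for 127 qubits would occupy 2**131 bytes,
# i.e. 2**91 terabytes -- far beyond any conceivable classical machine.
assert classical_memory_bytes(127) == 2 ** 131
```

This exponential scaling is what makes even modest increases in qubit count significant, although in practice a processor's usefulness also depends on how reliably those qubits operate.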
Canadian company D-Wave Systems has sold machines consisting of thousands of qubits for some years, but they are widely considered to be specialised devices tailored to a particular technique called quantum annealing, rather than fully programmable quantum computers. In recent years, much progress in quantum computing has focused on superconducting qubits, the technology that Google, USTC and IBM are all backing.
Bob Sutor at IBM says that breaking the 100-qubit barrier is more psychological than physical, but that it shows the technology can grow. “With Eagle, we’re demonstrating that we can scale, that we can start to generate enough qubits to get on a path to have enough computation capacity to do the interesting problems. It’s a stepping stone to bigger machines,” he says.
However, it is difficult to compare the power of the IBM chip with previous processors. Both Google and USTC used a common benchmark to assess their chips: running a random quantum circuit and sampling numbers from its output, a task that rapidly becomes intractable for classical computers to simulate. IBM claims to have created a more programmable and adaptable processor, but has yet to publish an academic paper setting out its performance or abilities.
Peter Leek at the University of Oxford says it is tempting to assess performance purely on qubit count, but that other metrics also need to be considered – none of which has yet been released for Eagle. “It’s definitely positive, it’s good that they’re making something with more qubits, but ultimately it only becomes useful when the processor performs really well,” he says.
Scott Aaronson at the University of Texas at Austin has similar reservations about judging the importance of the new processor at this stage, saying that more detail is needed. “I hope that information will be forthcoming,” he says.
IBM has said that it hopes to demonstrate a 400-qubit processor next year and to break the 1000-qubit barrier the following year with a chip called Condor. At that point, individual chips are expected to hit a practical limit on expansion, so larger quantum computers would have to be built from networks of these processors strung together by fibre-optic links.