Quantum computers have the potential to revolutionize drug discovery, materials design, and fundamental physics, if they can be made to work reliably. Today, artificial intelligence (AI) is expected to help make that happen. Google recently released AlphaQubit, a Transformer-based decoder that identifies quantum computing errors with state-of-the-art accuracy, accelerating progress toward reliable quantum computers. Google CEO Sundar Pichai wrote on X: "AlphaQubit uses Transformers to decode quantum computers, achieving a new state of the art in quantum error correction accuracy. The intersection of AI and quantum computing is exciting." The related research paper, titled "Learning high-accuracy error decoding for quantum processors", has been published in the scientific journal Nature.

Accurately identifying errors is a key step toward quantum computers that can perform long, large-scale computations, opening the door to scientific breakthroughs and discoveries in many new fields.

Correcting quantum computing errors

Quantum computers exploit the unique properties of matter at the smallest scales, such as superposition and entanglement, to solve certain classes of complex problems in far fewer steps than classical computers. The technology relies on quantum bits (qubits), which can sift through vast numbers of possibilities using quantum interference to find an answer. But the natural quantum state of a qubit is fragile: it can be corrupted by all sorts of factors, including tiny defects in the hardware, heat, vibration, electromagnetic interference, and even cosmic rays (which are everywhere).

Quantum error correction offers a solution through redundancy: multiple physical qubits are combined into a single logical qubit, which is periodically subjected to consistency checks. A decoder uses these consistency checks to identify errors in the logical qubit and correct them, thereby preserving the quantum information.
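The redundancy idea above can be made concrete with a much simpler cousin of the surface code: a distance-3 repetition code. This is a minimal sketch for illustration only (it protects against classical bit flips, not general quantum errors, and is not the code AlphaQubit decodes): three physical bits encode one logical bit, and two parity "consistency checks" between neighbouring bits reveal errors without reading the data directly.

```python
def syndrome(data):
    """Parity checks between adjacent data bits (1 = check failed).
    These are the 'consistency checks' a decoder receives as input."""
    return [data[0] ^ data[1], data[1] ^ data[2]]

def decode(syn):
    """Map a syndrome to the most likely single-bit error position
    (None = no error detected). This lookup table is the entire
    'decoder' for a code this small."""
    return {(1, 0): 0, (1, 1): 1, (0, 1): 2, (0, 0): None}[tuple(syn)]

logical = [0, 0, 0]          # logical 0, encoded redundantly
logical[1] ^= 1              # noise flips the middle physical bit
err = decode(syndrome(logical))
if err is not None:
    logical[err] ^= 1        # apply the correction
assert logical == [0, 0, 0]  # the logical bit survived the error
```

For real surface codes the lookup table above is replaced by far more sophisticated decoders; AlphaQubit's contribution is learning that mapping from data rather than deriving it from a noise model.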
Figure | How 9 physical qubits (small gray circles) in a qubit grid of side length 3 (the code distance) form one logical qubit. At each time step, a further 8 qubits perform consistency checks (square and semicircular areas: blue or magenta when a check fails, gray otherwise), providing information to AlphaQubit. At the end of the experiment, AlphaQubit determines which errors occurred.

Creating a Neural Network Decoder

AlphaQubit is a neural-network decoder built on Transformers, the deep learning architecture developed at Google that underlies many of today's large language models (LLMs). Given the consistency checks as input, its task is to correctly predict whether the logical qubit's measurement at the end of the experiment is flipped relative to how it was prepared.

The research team first trained the model to decode data from a set of 49 qubits in the Sycamore quantum processor (the central computing unit of a quantum computer). To teach AlphaQubit the general decoding problem, they used a quantum simulator to generate hundreds of millions of examples across a variety of settings and error levels. They then fine-tuned AlphaQubit for a specific decoding task by feeding it thousands of experimental samples from a particular Sycamore processor.

When tested on new Sycamore data, AlphaQubit set a new standard for accuracy compared with the previous leading decoders. In the largest Sycamore experiments, AlphaQubit made 6% fewer errors than tensor-network methods, which are highly accurate but far too slow, and 30% fewer errors than correlated matching, an accurate decoder that is fast enough to scale.

Figure | Decoding accuracy in the small and large Sycamore experiments (distance 3 = 17 physical qubits; distance 5 = 49 physical qubits). AlphaQubit is more accurate than tensor networks (TN, an approach that does not scale to large experiments) and correlated matching, an accurate decoder fast enough to scale.
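The two-stage recipe described above, pretraining on plentiful simulated data and then fine-tuning on scarce experimental samples, can be sketched in miniature. Everything here is a toy stand-in: logistic regression replaces the Transformer, the `simulate` function and its labelling rule (many failed checks suggest a logical flip) are invented for illustration and are not real surface-code statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, p_fail, n_checks=8):
    """Toy stand-in for the quantum simulator used in pretraining.
    Each sample is a pattern of failed consistency checks; the label
    says whether the logical measurement ended up flipped. The rule
    below is illustrative only, not surface-code physics."""
    X = (rng.random((n, n_checks)) < p_fail).astype(float)
    y = (X.sum(axis=1) >= 5).astype(float)            # invented labelling rule
    y = np.where(rng.random(n) < 0.05, 1 - y, y)      # label noise, as on hardware
    return X, y

def train(X, y, w=None, lr=0.5, epochs=200):
    """Logistic regression standing in for the Transformer decoder:
    input = consistency checks, output = P(logical flip)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])          # append a bias column
    if w is None:
        w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(Xb @ w)))            # predicted flip probability
        w -= lr * Xb.T @ (p - y) / len(y)              # batch gradient step
    return w

# Stage 1: pretrain on plentiful simulated data.
Xs, ys = simulate(5000, p_fail=0.5)
w = train(Xs, ys)

# Stage 2: fine-tune the same weights on scarcer "experimental"
# samples drawn at a different error level.
Xe, ye = simulate(500, p_fail=0.4)
w = train(Xe, ye, w=w, epochs=50)
```

The design point this illustrates is data efficiency: the simulator supplies the bulk of the training signal, so only thousands (not millions) of precious hardware samples are needed to adapt the decoder to a specific device.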
Generalizing beyond the training data

To understand how AlphaQubit would handle larger devices with lower error levels, the team trained it on data from simulated quantum systems of up to 241 qubits, beyond what the Sycamore platform offers. Here too, AlphaQubit outperformed the leading algorithmic decoder, suggesting it could also serve future mid-sized quantum devices.

Figure | Decoding accuracy across simulated scales, from distance 3 (17 qubits) to distance 11 (241 qubits). The tensor-network decoder is not shown because it is too slow to run at large distances. The accuracy of the other two decoders improves with distance (i.e. with more physical qubits), and at every distance AlphaQubit is more accurate than correlated matching.

AlphaQubit also demonstrates advanced capabilities, such as accepting and reporting confidence levels for its inputs and outputs. These information-rich interfaces could help further improve the performance of quantum processors. And although it was trained on samples containing at most 25 rounds of error correction, AlphaQubit maintained good performance in simulations of up to 100,000 rounds, demonstrating an ability to generalize well beyond its training data.

Real-time error correction still needs to be accelerated

Google calls AlphaQubit an important milestone in quantum error correction using machine learning, but significant challenges in speed and scalability remain. In a fast superconducting quantum processor, for example, each consistency check is measured a million times per second. AlphaQubit is excellent at identifying errors accurately, but it is still too slow to correct them in real time on a superconducting processor. And as quantum computing grows toward the millions of qubits that commercial applications may require, more data-efficient methods of training AI-based decoders will be urgently needed.

Compiled by: Academic Jun
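The real-time constraint mentioned above can be made concrete with some back-of-envelope arithmetic: if consistency checks arrive a million times per second, a real-time decoder has roughly a one-microsecond budget per round (amortised). The sketch below times a deliberately trivial stand-in decode against that budget; the numbers are illustrative only and say nothing about AlphaQubit's actual timings.

```python
import time
import numpy as np

# One round of consistency checks arrives roughly every microsecond
# on a fast superconducting processor (per the article above).
BUDGET_NS = 1_000  # ~1 microsecond per round, amortised

rng = np.random.default_rng(0)
syndromes = rng.integers(0, 2, size=(10_000, 8))  # 10k rounds of 8 checks
w = np.ones(8)  # stand-in for a trained decoder's parameters

start = time.perf_counter()
flips = (syndromes @ w) > 4          # trivial stand-in decode, batched
elapsed = time.perf_counter() - start
per_round_ns = elapsed / len(syndromes) * 1e9

print(f"{per_round_ns:.1f} ns per round vs {BUDGET_NS} ns budget")
```

Even a trivial linear decode only fits the budget because it is batched and vectorised; a large Transformer forward pass per round is orders of magnitude more work, which is why inference speed is the open challenge the article closes on.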