IBM's latest quantum computing progress: Is quantum computing's ChatGPT moment coming soon?

In June 2023, IBM Quantum and its partners, including the University of California, Berkeley, RIKEN, and Lawrence Berkeley National Laboratory, published a cover paper in Nature. Using an "error mitigation" method, they accurately estimated the results of running complex quantum circuits on a 127-qubit processor. In the strongly entangled regime, these circuits can no longer be brute-force simulated on classical computers. Many people believe this is another milestone for quantum computing. So what is error mitigation? What has IBM done, and what has it not done? This article attempts an interpretation.

Written by Jin Yirong (Beijing Institute of Quantum Information Science)

At the end of 2022, I was invited by the journal Physics to translate an interview with Jay M. Gambetta, Vice President of IBM Quantum. The English title was "Turning a Quantum Advantage". Translating this title both elegantly and faithfully took me some time. In the end, my editor Mr. Yang and I settled on "Lighting up the Quantum Advantage" (see "Lighting up the Quantum Advantage丨Interview with the Vice President of IBM's Quantum Division"). As the saying goes, there have always been talented people in the comment section; if readers have better translations, please feel free to leave a comment.

While translating, I was struck by some of Gambetta's remarks. For one, he cited some figures: a decoherence time that he said had reached 100 milliseconds and would soon reach 300 milliseconds (I briefly suspected the reporter had made a mistake), and a two-qubit gate fidelity that had reached 99.9% and would reach 99.99% by the end of 2023. I work on quantum hardware, and those two figures alone were enough to shock me. What followed was even more remarkable.

First, he said, "It will be more important to do things in a smarter way than to pile up metrics." In other words, to achieve quantum advantage in the future, it is not enough to keep improving technical metrics such as qubit count, decoherence time, and gate fidelity. We also need to think about how to scale up and engineer systems at the architectural level, and to introduce new methods for dealing with the errors that quantum computers inevitably make.

Second, when discussing quantum error correction, he said that his team was researching error mitigation methods: building a large number of circuit instances for representative error models, sampling the evolution results of these circuits, and using statistical methods to learn the error behavior of the whole quantum system, in order to give an error-free estimate of the quantum circuit's output. If the accuracy of this error-free estimate keeps approaching 1, wouldn't we effectively have achieved quantum error correction? Seen this way, quantum error correction is no longer a single arduous leap (see "The Next Super Challenge of Quantum Computing") but a gradual process, like hiking up a mountain one small step at a time: by the time the sun sets, you may look back and find yourself already at the summit.

Half a year later, IBM published a paper in Nature titled "Evidence for the utility of quantum computing before fault tolerance," which instantly drew a strong response from academia and industry. 100+ qubits, no quantum error correction needed, beyond classical computing, a new milestone: these phrases all firmly seized readers' attention. Perhaps this is another highlight in the development of quantum computing since Google's "quantum supremacy". After reading the paper carefully, I recalled Gambetta's views in the interview and suddenly saw the connection: Gambetta had already clearly expressed the paper's ideas, and I had translated them into Chinese half a year earlier to introduce them to domestic readers. Now the paper had stunned the audience and everyone sat up in astonishment: so this is how quantum computing can be played...

IBM's achievements appear on the cover of Nature's June 15th issue | Source: Nature

In any case, I still hope to interpret this work as calmly as possible with my professional knowledge. This time, IBM researchers and their collaborators implemented the Trotterized time evolution of a two-dimensional transverse-field Ising model (with the same topological connectivity as the quantum chip) on a 127-qubit processor [Note 1], and used the zero-noise extrapolation (ZNE) error mitigation method to obtain an "accurate" [Note 2] estimate of the evolution result. The full circuit involves 127 qubits, up to 60 layers of two-qubit gates, and a total of 2880 CNOT gates. In the strongly entangled regime, classical tensor-network approximation methods can no longer give correct results; in other words, the circuit is beyond classical brute-force simulation.

The Trotterized time evolution of the two-dimensional transverse-field Ising model implemented on a 127-qubit processor (a, b), and how the errors in the whole system are calibrated (c, d) | Image source: Reference [1]
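To make the Trotterization concrete, here is a minimal numerical sketch, not IBM's circuit: a tiny 3-qubit transverse-field Ising chain whose exact time evolution is compared against a first-order Trotter product. All parameter values (J, h, the step count) are illustrative choices, not values from the paper.

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(op, site, n):
    """Embed a single-qubit operator `op` on qubit `site` of an n-qubit register."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, op if i == site else I2)
    return out

n = 3            # a tiny chain standing in for the 127-qubit lattice
J, h = 1.0, 0.7  # illustrative couplings, not the paper's values

# Transverse-field Ising Hamiltonian: H = -J * sum_i Z_i Z_{i+1} - h * sum_i X_i
H_zz = -J * sum(op_on(Z, i, n) @ op_on(Z, i + 1, n) for i in range(n - 1))
H_x = -h * sum(op_on(X, i, n) for i in range(n))
H = H_zz + H_x

def evolve(A, t):
    """U = exp(-i t A) for Hermitian A, via eigendecomposition."""
    w, v = np.linalg.eigh(A)
    return v @ np.diag(np.exp(-1j * t * w)) @ v.conj().T

t, steps = 1.0, 200
psi0 = np.zeros(2 ** n, dtype=complex)
psi0[0] = 1.0  # start in |000>

# Exact evolution vs. first-order Trotter product (e^{-i dt H_zz} e^{-i dt H_x})^steps
psi_exact = evolve(H, t) @ psi0
U_step = evolve(H_zz, t / steps) @ evolve(H_x, t / steps)
psi_trot = np.linalg.matrix_power(U_step, steps) @ psi0

Z0 = op_on(Z, 0, n)
mz_exact = float((psi_exact.conj() @ Z0 @ psi_exact).real)
mz_trot = float((psi_trot.conj() @ Z0 @ psi_trot).real)
print(f"<Z_0> exact:   {mz_exact:.4f}")
print(f"<Z_0> Trotter: {mz_trot:.4f}")  # approaches the exact value as steps grows
```

On the real processor, each Trotter step is compiled into layers of single-qubit rotations and CNOT gates rather than matrix exponentials; the point here is only that splitting the evolution into many small steps reproduces the exact dynamics to controllable accuracy.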

The paper frames quantum advantage as achievable in two steps. First, perform accurate computations beyond classical brute-force simulation on existing quantum hardware; then, on that basis, find (valuable) problems and achieve accurate estimates for the quantum circuits related to them (I say "estimate" rather than "computation" because on a noisy quantum circuit, what we get is always a statistical result). The work in the paper completes the first step, so strictly speaking, quantum advantage has not yet been achieved.

Still, this work is a step beyond Google's "quantum supremacy". That is not because of the larger qubit count, greater circuit depth, or larger number of two-qubit gates, but because the random circuit sampling Google performed at the time had extremely low fidelity, whereas IBM's work can accurately give a biased estimate [Note 3] for a complex quantum circuit through error mitigation. This sets a strong expectation for the performance of noisy quantum computers. If we can take one more step and replace the two-dimensional transverse-field Ising evolution circuit used here with a quantum circuit tied to a valuable problem (a step that remains difficult), quantum advantage will be truly established.

So what is this error mitigation method that can turn the rotten into the miraculous? Bear in mind that at a scale of 100+ qubits and 60 circuit layers, even with average gate and readout fidelities above 99%, the probability of directly obtaining the correct result is almost zero. IBM used a method called zero-noise extrapolation. Specifically, the researchers used the so-called sparse Pauli-Lindblad model to learn the system's errors; by adjusting its parameters, they could realize different noise gains G, then sample a large number of noisy circuit instances at each gain and compute their expectation values. The expectation values at different noise gains are then extrapolated to G = 0 (that is, the noiseless case), which amounts to inferring the error-free result. Readers who have studied numerical methods probably know that, compared with interpolation, extrapolation is often unreliable, especially far from the target point. To address this, IBM tested both exponential and linear extrapolation, and performed a quantum-classical comparison on special cases that can be simulated classically (when all gates in the circuit are converted into Clifford gates). The results were highly consistent, which is why IBM claims that this method gives accurate results.

The blue dots are data points after error mitigation, and the green dots are data points without error mitigation. The pink and orange lines are the calculation results using the MPS and isoTNS tensor network approximation methods, respectively. Image source: Reference [1]
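The extrapolation step can be illustrated with a toy model. The sketch below assumes, purely for illustration, that the measured expectation value decays exponentially with the noise gain G (the assumption behind exponential ZNE); the "measurement" is simulated classically with shot noise, and E_ideal and the decay rate are made-up parameters, not data from IBM's experiment.

```python
import numpy as np

rng = np.random.default_rng(7)

# Made-up "ground truth": the noiseless expectation value we hope to recover.
E_ideal = 0.8
decay = 0.5  # assumed effective error rate per unit of noise gain

def measure_expectation(gain, shots=100_000):
    """Toy noisy measurement: the signal decays exponentially with the
    noise gain G, and finite sampling adds shot noise."""
    e_true = E_ideal * np.exp(-decay * gain)
    p_plus = (1 + e_true) / 2  # probability of a +1 outcome for a +/-1 observable
    outcomes = rng.choice([1.0, -1.0], size=shots, p=[p_plus, 1 - p_plus])
    return outcomes.mean()

# Run the "same circuit" at several amplified noise gains G >= 1
gains = np.array([1.0, 1.5, 2.0, 3.0])
estimates = np.array([measure_expectation(g) for g in gains])

# Exponential extrapolation: fit log E = log A - b*G, then evaluate at G = 0
slope_exp, logA = np.polyfit(gains, np.log(np.abs(estimates)), 1)
E_zne_exp = np.exp(logA)

# Linear extrapolation for comparison: the fitted intercept is the G = 0 estimate
slope_lin, E_zne_lin = np.polyfit(gains, estimates, 1)

print(f"raw estimate (G=1): {estimates[0]:+.3f}")
print(f"linear ZNE:         {E_zne_lin:+.3f}")
print(f"exponential ZNE:    {E_zne_exp:+.3f}  (ideal: {E_ideal:+.3f})")
```

In this toy setup the exponential fit recovers the ideal value well while the linear fit undershoots, which mirrors why IBM tested both functional forms; on real hardware the decay model itself must be validated, which is where the learned sparse Pauli-Lindblad noise model comes in.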

The researchers also compared the runtime of the quantum computer against the tensor-network method. In fact, tensor networks can no longer give accurate expectation values for such deep circuits. Executing the same circuits, the tensor-network method takes 8 hours and 30 hours to obtain one data point (for the two evolution models, respectively), while the quantum runtimes are 4 hours and 9.5 hours; within that, the quantum processor itself runs for only 5 minutes and 7 seconds, and the runtime could be further reduced by shortening the qubit reset time. In other words, there is still huge room for improving the runtime of quantum computers.

Of course, error mitigation comes at a cost. Zero-noise extrapolation has a much lower sampling overhead than the previously proposed probabilistic error cancellation and can handle complex quantum circuits with 100+ qubits. However, from the information disclosed so far, this overhead still grows exponentially with the size of the quantum system. How to mitigate errors efficiently on future, larger-scale quantum processors remains a challenge.

The successful verification of this method is like a ray of light shining into the era of noisy quantum computing, but much work remains before quantum computing becomes productive. On one hand, quantum hardware performance must improve further: the paper mentions that two-qubit gate fidelity needs to improve by "orders of magnitude", and running speed must also increase substantially. On the other hand, it is urgent to further verify how effective noise mitigation and cancellation algorithms are for the heuristic quantum algorithms currently receiving the most attention, including quantum chemistry calculations and approximate optimization.

Let's return to the interview with Gambetta. When asked when quantum computing will beat classical computing, he said something I admire: rather than separating classical and quantum, setting them against each other, and waiting for the moment quantum defeats classical, it is better to take a more general view and unify the two. Computing is computing. In practice, quantum computing requires a great deal of classical computing assistance; the error mitigation method described above is a typical example. What we are really pursuing is runtime efficiency in solving complex problems. Classical computing assists quantum computing, and quantum computing in turn helps classical computing; the two are an inseparable whole, and we need to look at quantum computing from this higher vantage point.

Finally, it is worth mentioning that high-quality quantum resources are extremely valuable. IBM's work was done on a quantum cloud platform codenamed "ibm_kyiv", using the "Eagle_r3" 127-qubit processor. The processor's median decoherence times T1 and T2 are 288 microseconds and 127 microseconds, respectively, an unprecedented level. CNOT gates between adjacent qubits are realized via the cross-resonance (CR) interaction. Thanks to the long decoherence times and other strong specifications, the median two-qubit gate fidelity exceeds 99%, and the median readout fidelity also exceeds 99%; these are essential hardware conditions for the error mitigation method to converge. Advancing quantum hardware certainly depends on core hardware teams, but extracting the full value of noisy quantum hardware requires broad intellectual participation from mathematics, statistics, computing, information science, software, and other disciplines. The best way to encourage such broad, high-level collaborative innovation is to share the best quantum resources, and through its quantum cloud computing platform, IBM has been doing exactly that.

Unfortunately, these top quantum computing resources are no longer open to China. The good news is that our own 100+ qubit quantum computing cloud platforms have been launched and are open to the world! As more and more people in China participate, and as expected demand for quantum applications keeps growing, I believe China's "critical moment" of quantum advantage will arrive sooner.

Notes

[1] Trotterization (named after the mathematician H. F. Trotter; the word also evokes "trotting" along in small steps) refers here to approximately splitting the time evolution of a quantum system into many small steps in order to handle the evolution of complex quantum systems.

[2] "Accurate" is in quotation marks because the circuit's results can only be classically simulated in certain special cases; in the strongly entangled regime they cannot be brute-force simulated, and the tensor-network method gives only approximate results, so it cannot serve as classical verification there. In short, the accuracy of the experimental results in the strongly entangled regime is inferred from classical verification in the simulatable regime.

[3] A biased estimate is one whose expected value differs systematically from the true value of the parameter being estimated; that is, there is a systematic error between the estimate obtained from the samples and the true value.
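A classic statistics illustration of this note (unrelated to the quantum experiment itself): the sample variance computed with a 1/n factor is a biased estimator of the true variance, since its expectation is (n-1)/n times the true value, whereas the 1/(n-1) version is unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0  # we sample from N(0, 2^2), so the true variance is 4

# Draw many small samples and average each variance estimator over them
n, trials = 5, 200_000
samples = rng.normal(0.0, 2.0, size=(trials, n))

biased = samples.var(axis=1, ddof=0).mean()    # divides by n
unbiased = samples.var(axis=1, ddof=1).mean()  # divides by n - 1

print(f"true variance:          {true_var:.3f}")
print(f"biased estimator (1/n): {biased:.3f}")    # systematically below 4
print(f"unbiased (1/(n-1)):     {unbiased:.3f}")  # close to 4
```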

References

1. Kim, Y., Eddins, A., Anand, S. et al. Evidence for the utility of quantum computing before fault tolerance. Nature 618, 500–505 (2023).

2. Quafu quantum cloud platform: quafu.baqis.ac.cn

3. QuantumCTek Cloud Platform: quantumctek-cloud.com

This article is supported by the Science Popularization China Starry Sky Project

Produced by: China Association for Science and Technology Department of Science Popularization

Producer: China Science and Technology Press Co., Ltd., Beijing Zhongke Xinghe Culture Media Co., Ltd.


Copyright statement: Personal forwarding is welcome. Any form of media or organization is not allowed to reprint or excerpt without authorization. For reprint authorization, please contact the backstage of the "Fanpu" WeChat public account.
