The Goal of Quantum Computing Beyond NISQ: The Megaquop Milestone, Says Preskill
Quantum computing is moving beyond its current noisy intermediate-scale quantum (NISQ) stage towards a more robust and reliable era of fault-tolerant quantum computing. According to John Preskill, who coined the term NISQ, this next stage will require quantum error-correcting codes to reach commercially viable applications. In a recent talk at Q2B, Preskill discussed what comes after the NISQ era.
Companies such as Google, IBM, and Microsoft are advancing quantum error correction technology, with Google recently demonstrating improved transmon lifetimes and error-correction performance in its Willow processor. Researchers at Harvard, MIT, and Yale are also making progress on new qubit technologies, including cat qubits and fluxonium, which could lead to more robust and scalable quantum computing.
The goal is to create a megaquop machine capable of executing millions of quantum operations, which could have significant implications for fields such as chemistry and materials science.
Introduction to Quantum Computing Beyond NISQ
The current state of quantum computing is characterized by Noisy Intermediate-Scale Quantum (NISQ) devices, which are valuable for scientific exploration but lack commercially viable applications due to the absence of demonstrated quantum advantage over classical hardware. The future of quantum computing lies in the development of fault-tolerant quantum computing, which will enable the creation of reliable and scalable quantum machines.
One such machine is the Megaquop Machine, a proposed device that can execute circuits with millions of quantum operations. Significant advances are needed in hardware, control, algorithms, error correction, and error mitigation to achieve this goal.
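To put the milestone in perspective, a back-of-the-envelope error budget shows what "millions of operations" implies for reliability. The numbers below are illustrative assumptions, not figures from the talk: if a circuit contains a million logical operations, the error rate per operation must be of order 10^-6 for the run to succeed with reasonable probability.

```python
# Rough error-budget arithmetic for a "megaquop" machine. The specific numbers
# here are illustrative assumptions, not values quoted from the talk.

n_ops = 1_000_000            # target number of logical quantum operations
p_logical = 1e-6             # assumed error rate per logical operation

# Probability that the whole circuit completes without a single logical error,
# treating errors as independent across operations.
p_success = (1.0 - p_logical) ** n_ops
print(f"success probability for {n_ops:,} ops at p={p_logical}: {p_success:.3f}")
# ~0.37: expected errors per circuit = n_ops * p_logical = 1, so the logical
# error rate must sit at or below ~1e-6 for million-operation circuits to be useful.
```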
The development of fault-tolerant quantum computing requires the implementation of quantum error correction codes, which can detect and correct errors that occur during quantum computations. These codes are essential for reliable and scalable quantum computing, as they enable the creation of high-fidelity logical qubits.
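As a minimal illustration of the detect-and-correct idea, the classical three-bit repetition code protects one logical bit by majority vote. This toy sketch (not any specific code from the talk) shows how redundancy suppresses the logical error rate from p to roughly 3p^2:

```python
import random

# Toy three-bit repetition code: encode one logical bit into three physical
# bits, let each bit flip independently with probability p, then decode by
# majority vote. A classical caricature of quantum error correction, shown
# only to illustrate how redundancy suppresses errors.

def encode(bit):
    return [bit, bit, bit]

def apply_noise(bits, p):
    return [b ^ int(random.random() < p) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)   # majority vote

def logical_error_rate(p, trials=100_000):
    errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
    return errors / trials

p = 0.01
print(f"physical error rate: {p}, logical error rate: {logical_error_rate(p):.5f}")
# A logical failure needs at least two flips, so the rate scales as ~3p^2 (~3e-4 here).
```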
The surface-code threshold is a critical milestone in the development of quantum error correction: once physical error rates fall below this threshold, increasing the code distance suppresses the logical error rate further, so errors can in principle be made arbitrarily rare. Recent progress in quantum error correction has brought us closer to this regime, with advances in codes ranging from repetition codes to concatenated codes.
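The practical meaning of the threshold can be seen from the standard heuristic scaling of the surface-code logical error rate, p_L ≈ A (p/p_th)^((d+1)/2) for code distance d. The prefactor and threshold values in this sketch are illustrative assumptions, not measured numbers:

```python
# Heuristic below-threshold scaling for the surface code: the logical error
# rate falls roughly as p_L ~ A * (p / p_th)**((d + 1) / 2) once the physical
# error rate p is below the threshold p_th. A and p_th here are assumptions.

def logical_error_rate(p, d, p_th=0.01, A=0.1):
    return A * (p / p_th) ** ((d + 1) / 2)

for d in (3, 5, 7, 11):
    print(f"d={d:2d}  p=0.1*p_th -> p_L={logical_error_rate(0.001, d):.1e}")
# Each increase of the distance by 2 multiplies p_L by (p/p_th), so operating
# well below threshold lets the logical error rate be pushed to ~1e-6 and
# beyond simply by growing the code.
```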
Advances in Quantum Error Correction
Significant progress has been made in quantum error correction, with several groups demonstrating the ability to correct errors in small-scale quantum systems. For example, Google has demonstrated accurate real-time decoding of error syndromes for distance-3 and distance-5 codes.
At the same time, Riverlane and Rigetti have integrated field-programmable gate arrays (FPGAs) into their control systems to reduce decoding latency for small codes. Harvard and QuEra have also developed a “correlated decoding” approach that can reduce circuit-depth overhead by decoding across multiple code blocks. These advances demonstrate the potential for quantum error correction to enable reliable and scalable quantum computing.
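Real-time decoding is demanding because corrections must be computed faster than syndromes accumulate, which is why fixed, low-latency logic such as an FPGA lookup table is attractive for small codes. The sketch below shows the idea for a distance-3 bit-flip repetition code; actual surface-code decoders are far more involved:

```python
# Minimal lookup-table decoder for a distance-3 bit-flip repetition code.
# Two parity checks (syndrome bits) identify which single data qubit flipped,
# and the correction is a constant-time table lookup -- the kind of fixed,
# branch-free logic that maps naturally onto an FPGA for real-time decoding.
# (Illustrative sketch only; real surface-code decoders are far more complex.)

# Syndrome (s01, s12): parities of qubit pairs (0,1) and (1,2).
SYNDROME_TABLE = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # flip on qubit 0
    (1, 1): 1,     # flip on qubit 1
    (0, 1): 2,     # flip on qubit 2
}

def measure_syndrome(bits):
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    flip = SYNDROME_TABLE[measure_syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

print(correct([0, 1, 0]))   # single flip on qubit 1 -> corrected to [0, 0, 0]
```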
Other approaches to quantum error correction include cat qubits with highly biased noise, dual-rail qubits for erasure detection, and GKP qubits, qutrits, and ququarts in resonators. These approaches offer promising alternatives to conventional error correction codes and may improve performance and simplicity. On the hardware side, fluxonium qubits exhibit strong anharmonicity and high gate fidelity, with two-qubit fidelities above 0.999. Together, these advances highlight the diversity of approaches being explored and the potential for continued innovation in this field.
Toward the Megaquop Machine
The development of the Megaquop Machine requires significant advances in hardware, control, algorithms, error correction, and error mitigation. High-rate codes such as the Bivariate Bicycle (BB) code promise high-fidelity logical qubits at lower qubit overhead, while logical error mitigation techniques can boost the circuit volume that can be executed reliably.
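One way to picture what error mitigation buys is zero-noise extrapolation, a common mitigation technique (whether this is the specific logical-level scheme Preskill had in mind is an assumption): the same observable is measured at several artificially amplified noise strengths and the results are extrapolated back to zero noise, trading extra circuit runs for a less biased estimate.

```python
import numpy as np

# Sketch of zero-noise extrapolation (ZNE), one common error-mitigation
# technique, shown on a toy noise model. Everything here is an illustrative
# assumption, not a reconstruction of any particular experiment.

def noisy_expectation(noise_scale, true_value=1.0, decay=0.05):
    # Toy model: the measured expectation value decays as noise is amplified.
    return true_value * np.exp(-decay * noise_scale)

scales = np.array([1.0, 2.0, 3.0])          # noise amplification factors
values = noisy_expectation(scales)

# Linear (Richardson-style) fit in the noise scale, evaluated at scale = 0.
coeffs = np.polyfit(scales, values, deg=1)
mitigated = np.polyval(coeffs, 0.0)

print(f"raw (scale 1): {values[0]:.4f}, mitigated: {mitigated:.4f}, true: 1.0")
```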
Additionally, algorithmic advances may reduce the resources needed for applications in chemistry and materials science: fewer qubits may be required for certain chemical simulations, while translation invariance can enable parallel operation and reduce circuit depth.
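To see how translation invariance helps, consider a Trotterized simulation of a translation-invariant nearest-neighbour chain (a toy assumption, not a model from the talk): the same two-qubit gate acts on every bond, so all even bonds can fire in one layer and all odd bonds in the next, giving a depth per step that is independent of system size.

```python
# Toy illustration of translation invariance enabling parallelism: one Trotter
# step on an n-site chain needs only two layers of identical two-qubit gates
# (even bonds, then odd bonds), so the depth per step does not grow with n.
# (Assumes a nearest-neighbour, translation-invariant model for illustration.)

def trotter_step_layers(n_sites):
    even_layer = [(i, i + 1) for i in range(0, n_sites - 1, 2)]
    odd_layer = [(i, i + 1) for i in range(1, n_sites - 1, 2)]
    return [even_layer, odd_layer]

for n in (4, 8, 100):
    layers = trotter_step_layers(n)
    print(f"{n} sites: {sum(len(l) for l in layers)} gates in depth {len(layers)}")
```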
The Megaquop Machine represents a reachable near-term goal for the quantum community, with potential uses ranging from simulation and optimization to machine learning and materials science.
However, significant challenges must be overcome to achieve this goal, including the development of more efficient error correction codes, improved control and calibration techniques, and innovative approaches to algorithm design. The progress made in the megaquop era will guide our path to gigaquops, teraquops, and beyond, enabling the creation of increasingly powerful and reliable quantum machines.
Challenges and Opportunities
The Megaquop Machine’s development presents both challenges and opportunities for the quantum community. One of the primary challenges is the need for innovation at every level of the stack, from hardware and control to algorithms and error correction. This requires a co-design approach in which advances in basic science and systems engineering are closely integrated. In addition, more efficient error correction codes and improved control and calibration techniques will be essential for reliable and scalable quantum computing.
Despite these challenges, the potential rewards of developing the Megaquop Machine are significant. A machine capable of executing millions of quantum operations could enable breakthroughs in fields such as chemistry, materials science, and optimization, with potential applications ranging from drug discovery to climate modeling. The development of the Megaquop Machine represents a compelling challenge for the quantum community, requiring innovation and collaboration across disciplines to achieve this ambitious goal.
NISQ -> Megaquop
In conclusion, the future of quantum computing lies in fault tolerance, which will enable reliable and scalable quantum machines. The Megaquop Machine is a reachable near-term goal for the quantum community, with potential uses spanning simulation, optimization, machine learning, and materials science, but getting there demands significant advances in hardware, control, algorithms, error correction, and error mitigation. Meeting the challenges and seizing the opportunities it presents will require innovation and collaboration across disciplines, and the progress made along the way will chart the course toward gigaquop and teraquop machines.