Researchers develop faster method to measure quantum data loss
Updated · ScienceDaily · May 3
Jeroen Danon at Norway's NTNU said the technique, built with a Niels Bohr Institute-led team in Copenhagen, cuts measurement time to about 10 milliseconds from roughly one second.
The method tracks fluctuating relaxation rates in superconducting qubits almost in real time, exposing rapid changes that earlier tests missed and helping identify why quantum information disappears.
Researchers say better diagnostics could improve tuning of quantum processors and support efforts to make still-unstable quantum computers more reliable and practical.
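For context, a conventional relaxation (T1) estimate is assembled from many repeated decay measurements swept over different wait times, which is why a single estimate takes on the order of a second. Below is a minimal sketch of that conventional approach, assuming a simple exponential decay model; the delays, shot counts, and simulated relaxation rate are illustrative and not the researchers' actual protocol.

```python
import numpy as np

# Minimal sketch of a conventional T1 (relaxation-rate) measurement.
# Assumption: excited-state survival decays as exp(-gamma * t).
# All numbers below are illustrative, not the team's settings.

rng = np.random.default_rng(0)
true_gamma = 1 / 30e-6                      # hypothetical rate (T1 = 30 µs)
delays = np.linspace(1e-6, 120e-6, 40)      # wait times after exciting the qubit
shots_per_delay = 500

# Simulate binomial single-shot readout at each delay.
p_excited = np.exp(-true_gamma * delays)
counts = rng.binomial(shots_per_delay, p_excited)
survival = counts / shots_per_delay

# Fit log(P) = -gamma * t by least squares through the origin.
mask = survival > 0
gamma_est = -np.sum(delays[mask] * np.log(survival[mask])) / np.sum(delays[mask] ** 2)

print(f"estimated T1 ≈ {1 / gamma_est * 1e6:.1f} µs")
# Collecting thousands of shots across many delays is what pushes this
# approach toward ~1 s per estimate, too slow to resolve fluctuations
# that happen on millisecond timescales.
```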
Now that scientists can see quantum errors in real time, how quickly can they actively correct them?
How will real-time qubit monitoring accelerate the race for a fault-tolerant quantum computer?
Could this breakthrough reveal that current qubits are too unstable, forcing a complete pivot in quantum hardware design?
Unveiling Qubit Instability: How 100x Faster Measurements Expose Two-Level System Defects
Overview
In April 2026, Professor Jeroen Danon's team developed a measurement technique that tracks qubit relaxation 100 times faster than before, cutting the measurement time from one second to just 10 milliseconds. The real-time method revealed rapid fluctuations in qubit stability caused by microscopic two-level systems (TLSs), which disrupt quantum information through spectral diffusion. The technique relies on an FPGA-based Bayesian adaptive tracking system that enables dynamic calibration and supports quantum error correction, accelerating the development of practical quantum hardware. However, scaling it to thousands of qubits remains challenging: the fluctuations are unpredictable and every qubit would need to be monitored continuously, so significant engineering and scientific hurdles remain.
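To make the idea concrete, here is a hedged sketch of Bayesian adaptive tracking of a fluctuating relaxation rate, in the spirit of the FPGA-based scheme described above. The grid of candidate rates, the adaptive delay rule, and the posterior-diffusion step are assumptions chosen for illustration; they are not the team's published algorithm.

```python
import numpy as np

# Hedged sketch: Bayesian adaptive tracking of a fluctuating relaxation rate.
# Each single-shot outcome updates a posterior over candidate rates, so an
# estimate is available after milliseconds of data instead of a full sweep.
# Grid, delay rule, drift model, and jump time are illustrative assumptions.

rng = np.random.default_rng(1)

gamma_grid = np.linspace(1e3, 1e5, 400)                 # candidate rates (1/s)
posterior = np.ones_like(gamma_grid) / gamma_grid.size  # flat prior

true_gamma = 3e4                                        # hypothetical current rate
delay = 1.0 / np.median(gamma_grid)                     # initial probe delay

for shot in range(2000):                                # single-shot measurements
    # Physics: excite the qubit, wait `delay`, read out.
    p1 = np.exp(-true_gamma * delay)
    outcome = rng.random() < p1                         # True = still excited

    # Bayesian update: likelihood of this outcome for every candidate rate.
    like = np.exp(-gamma_grid * delay) if outcome else 1 - np.exp(-gamma_grid * delay)
    posterior *= like
    posterior /= posterior.sum()

    # Adaptive step: probe near t ≈ 1/gamma_hat, where a single shot
    # carries the most information about the rate.
    gamma_hat = np.sum(gamma_grid * posterior)
    delay = 1.0 / gamma_hat

    # Small diffusion of the posterior so the tracker can follow a rate
    # that jumps when a two-level system reconfigures (spectral diffusion).
    posterior = 0.999 * posterior + 0.001 / gamma_grid.size

    # Let the "true" rate jump once, mimicking a TLS switching state.
    if shot == 1000:
        true_gamma = 6e4

print(f"tracked relaxation rate ≈ {gamma_hat:.3g} 1/s (true: {true_gamma:.3g} 1/s)")
```

The design point worth noting is that the estimate is refreshed after every single shot rather than after a full delay sweep, which is what lets a fast controller (such as an FPGA) expose rate jumps that a one-second averaged measurement would smear out.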