A new algorithm increases the efficiency of quantum computers
Quantum computing is taking a new leap forward thanks to research proposing a scheme that reduces the number of measurements needed to read out data stored in the state of a quantum processor. This will make quantum computers more efficient, faster, and ultimately more sustainable.
Quantum computers have the potential to solve important problems that are beyond reach even for the most powerful supercomputers, but they require an entirely new way of programming and creating algorithms. Universities and major tech companies are spearheading research on how to develop these new algorithms.
In a recent collaboration between the University of Helsinki, Aalto University, the University of Turku, and IBM Research Europe-Zurich, a team of researchers developed a new method to speed up calculations on quantum computers. The results were published in the prestigious journal PRX Quantum of the American Physical Society.
‘Unlike classical computers, which use bits to store ones and zeros, information is stored in the qubits of a quantum processor in the form of a quantum state, or a wavefunction,’ says postdoctoral researcher Guillermo García-Pérez from the University of Helsinki, first author of the paper. Therefore, special procedures are required to read out data from quantum computers.
‘The quantum state used is, in fact, generally impossible to reconstruct on conventional computers, so useful insights must be extracted by performing specific observations (which quantum physicists refer to as measurements),’ says García-Pérez.
The problem is the large number of measurements required by many popular applications of quantum computers, for example the Variational Quantum Eigensolver, which can be used to overcome important limitations in chemistry research, such as drug discovery. The number of measurements is known to grow very quickly with the size of the system being simulated, even when only partial information is needed. This makes the process hard to scale up, slowing down the computation and consuming a lot of computational resources.
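To put that growth in perspective, here is a back-of-the-envelope sketch in Python (an illustration only, not taken from the paper): a molecular Hamiltonian written as a sum of Pauli terms typically contains on the order of n⁴ terms for n qubits, and the conventional approach measures each term with its own batch of shots.

```python
# Back-of-the-envelope only: assumes the typical O(n^4) growth in the number of
# Pauli terms of a molecular Hamiltonian and an illustrative fixed shot budget per term.
def naive_measurement_budget(n_qubits: int, shots_per_term: int = 1000) -> int:
    """Circuit repetitions if every Pauli term is measured in its own batch."""
    n_terms = n_qubits ** 4  # rough term count for a molecular Hamiltonian
    return n_terms * shots_per_term

for n in (4, 8, 16, 32):
    print(f"{n:>2} qubits -> {naive_measurement_budget(n):>13,} shots")
# Output:  4 qubits ->       256,000 shots
#          8 qubits ->     4,096,000 shots
#         16 qubits ->    65,536,000 shots
#         32 qubits -> 1,048,576,000 shots
```

Even with a modest 1,000 shots per term, the budget passes a billion circuit repetitions at 32 qubits; this measurement overhead is the bottleneck the new method targets.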
The method proposed by García-Pérez and co-authors uses a generalized class of quantum measurements that are adapted throughout the calculation in order to extract the information stored in the quantum state efficiently. This drastically reduces the number of iterations, and therefore the time and computational cost, needed to obtain high-precision simulations.
Matteo Rossi, a postdoctoral researcher at Aalto, says that simulations on quantum computers have so far used straightforward measurements known as Pauli measurements. ‘Our work uses more general quantum measurements, which can be adjusted. The main challenge that we address is how to optimise these measurements efficiently, given that the best measurement depends on the state one is measuring, which is unknown beforehand. We solved the problem with an adaptive strategy,’ he explains.
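The difference between the two kinds of measurement can be made concrete with a minimal single-qubit sketch (an illustration only, not the researchers' code): a symmetric informationally complete POVM whose four tetrahedral outcomes are enough to estimate ⟨X⟩, ⟨Y⟩ and ⟨Z⟩ from a single data set, whereas a projective Pauli measurement targets one axis at a time.

```python
import numpy as np

# Toy single-qubit illustration, not the authors' code. POVM elements are
# E_k = (I + s_k . sigma) / 4 with s_k the four tetrahedron directions below.
rng = np.random.default_rng(0)
S = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)

def sic_probabilities(r):
    """Outcome probabilities p_k = Tr(rho E_k) for a state with Bloch vector r."""
    return (1 + S @ r) / 4

def estimate_bloch(counts):
    """Dual-frame reconstruction r = 3 * sum_k f_k s_k from outcome frequencies f_k."""
    freqs = counts / counts.sum()
    return 3 * S.T @ freqs

# Sample a finite number of shots from a state with a known Bloch vector.
r_true = np.array([0.3, -0.5, 0.6])
counts = rng.multinomial(100_000, sic_probabilities(r_true))

r_est = estimate_bloch(counts)          # r_est = (<X>, <Y>, <Z>) from one data set
print("estimated <X>, <Y>, <Z>:", np.round(r_est, 3))

# The same counts can be reused for any other single-qubit observable,
# e.g. O = 0.4 X - 0.2 Y + 0.7 Z, with no additional measurements.
coeffs = np.array([0.4, -0.2, 0.7])
print("estimated <O>:", round(float(coeffs @ r_est), 3))
```

A projective Pauli measurement would need a separate run for each of X, Y and Z; the informationally complete measurement extracts all of them, and any other expectation value, from the same outcome record.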
The method can reuse previous measurement outcomes and adjust its own settings. Subsequent runs are increasingly accurate, and the collected data can be reused again and again to calculate other properties of the system without additional costs.
‘We make the most out of every sample by combining all data produced. At the same time, we fine-tune the measurement to produce highly accurate estimates of the quantity under study, such as the energy of a molecule of interest. Putting these ingredients together, we can decrease the expected runtime by several orders of magnitude,’ says García-Pérez.
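A toy single-qubit calculation (again an illustration, not from the paper) shows the trade-off that such fine-tuning navigates: a measurement dedicated to the target observable has a lower single-shot variance than a fixed informationally complete one, so steering the generalized measurement towards the quantity under study pays off.

```python
import numpy as np

# Toy variance comparison, not from the paper. Target observable: Z on one qubit.
S = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)

def single_shot_variances(r):
    """Variance of one-shot estimators of <Z> for a state with Bloch vector r."""
    z = r[2]
    var_projective = 1 - z**2            # dedicated projective Z measurement
    p = (1 + S @ r) / 4                  # SIC-POVM outcome probabilities
    w = 3 * S[:, 2]                      # dual-frame weights for Z (all +/- sqrt(3))
    var_sic = p @ w**2 - z**2            # E[w^2] - E[w]^2, since E[w] = <Z>
    return float(var_projective), float(var_sic)

print(single_shot_variances(np.array([0.3, -0.5, 0.6])))   # roughly (0.64, 2.64)
```

The dedicated measurement wins on that one quantity but discards information about everything else; the adaptive, informationally complete approach aims to get close to the dedicated variance for the quantity of interest while keeping the full data set reusable for other properties of the system.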
Read the article: Learning to Measure: Adaptive Informationally Complete Generalized Measurements for Quantum Algorithms