Photoniques Magazine No. 131 | Page 72

BACK TO BASICS
QUANTUM computing
Figure 3. The variational quantum eigensolver algorithm... and its challenges. Left: VQE is a hybrid algorithm where the quantum processor (top) prepares a parameterized quantum state and measures the average values of the various Pauli terms contained in the Hamiltonian of the problem. These averages are combined into an estimate of the energy, which is used by a classical optimization algorithm to propose new parameters. The empirical average comes with a statistical error ∆E that leads to the so-called measurement problem of VQE (top right): the number N of samples required to reach chemical accuracy (1 mHa) is very large. The energy landscape tends to be very flat for deep enough variational circuits, leading to trainability issues: this is the barren plateau problem (bottom right).
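The sampling requirement in the caption follows from basic statistics: an empirical average of N shots has a statistical error of order σ/√N, so a target accuracy ε requires roughly N = (σ/ε)² samples. A back-of-the-envelope sketch, assuming a per-shot standard deviation of 1 Ha (an illustrative value, not a measured one):

```python
# Shot-count estimate for the VQE "measurement problem": the statistical
# error of an empirical average decreases as sigma / sqrt(N), so reaching
# a target accuracy eps requires about N = (sigma / eps)**2 samples.

sigma = 1.0   # assumed per-shot standard deviation of the energy estimator (Ha)
eps = 1e-3    # chemical accuracy: 1 mHa

N = (sigma / eps) ** 2
print(f"shots needed per energy evaluation: {N:.0e}")
```

Since the optimizer needs many such energy evaluations, the total shot budget multiplies accordingly, which is why the measurement problem is considered a serious overhead.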
help much without quantum error correction, a concept that Peter Shor borrowed from classical computers in the mid-1990s to fight decoherence [8], and that we will touch on later.
CAN WE NEVERTHELESS SALVAGE SOMETHING FROM CURRENT PROCESSORS?
This goal is pursued by many researchers and engineers, with efforts to create algorithms that are short enough to beat decoherence while at the same time overpowering classical processors.
To this aim, an old method, the variational method, was revisited with a quantum twist: to minimize the energy ⟨ψ(θ⃗)|H|ψ(θ⃗)⟩ of a family of parameterized states, one uses a quantum computer to prepare a state |ψ(θ⃗)⟩ and measure its energy, and a classical processor to propose new parameters θ⃗ to reach a minimum of the energy landscape, as illustrated in Fig. 3. If the quantum computer can prepare states |ψ(θ⃗)⟩ that are out of reach of the best classical algorithms and measure their energy with high accuracy, this method, dubbed the variational quantum eigensolver (VQE [9]), could lead to some practical advantage. VQE, however, comes with two intrinsic limitations (in addition to decoherence): the probabilistic measurement of the energy requires many samples, and the training of the variational parameters itself turns out to be plagued with plateaus and hence exponential slowdowns [10].
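The hybrid loop can be sketched on a single qubit. The toy Hamiltonian H = Z + 0.5 X and the Ry ansatz below are illustrative assumptions; a real VQE estimates each Pauli average from measurement samples on hardware rather than from the exact statevector, and feeds the result to a classical optimizer exactly as done here:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

# Toy single-qubit Hamiltonian: a stand-in for a sum of Pauli terms
H = Z + 0.5 * X

def ansatz(theta):
    """Parameterized state |psi(theta)> = Ry(theta)|0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """<psi(theta)| H |psi(theta)> -- the quantity VQE estimates by sampling."""
    psi = ansatz(theta)
    return psi @ H @ psi

# Classical outer loop: minimize the energy landscape over theta
res = minimize_scalar(energy, bounds=(0, 2 * np.pi), method="bounded")
exact = np.linalg.eigvalsh(H)[0]  # exact ground-state energy for comparison
print(f"VQE estimate: {res.fun:.6f}, exact: {exact:.6f}")
```

For this one-parameter ansatz the landscape is benign and the optimizer finds the exact ground-state energy; the difficulties discussed in the text appear when the circuit is deep and the averages are noisy.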

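The flatness behind the barren plateau problem can be illustrated numerically. Deep random variational circuits produce states that behave much like Haar-random ones, for which the variance of an observable such as Z⊗...⊗Z shrinks exponentially with the qubit number (analytically, 1/(2ⁿ + 1)). The sketch below is a toy concentration argument under that Haar-random assumption, not the full barren-plateau statement about gradients:

```python
import numpy as np

# Cost-function concentration: for random states in a 2^n-dimensional
# Hilbert space, Var(<Z x ... x Z>) decays exponentially with n, so the
# landscape seen by the optimizer flattens out.
rng = np.random.default_rng(0)

def haar_state(dim):
    """A random pure state, uniform under the Haar measure."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

variances = []
for n in (2, 4, 6, 8):
    dim = 2 ** n
    # Eigenvalues of Z⊗...⊗Z on the computational basis: (-1)^(number of 1 bits)
    z_diag = np.array([(-1) ** bin(i).count("1") for i in range(dim)])
    samples = [np.vdot(psi, z_diag * psi).real
               for psi in (haar_state(dim) for _ in range(2000))]
    variances.append(np.var(samples))
    print(f"n = {n}: Var(<Z...Z>) ~ {variances[-1]:.4f}")
```

The estimated variances drop by roughly a factor of four for every two extra qubits, in line with the 1/(2ⁿ + 1) prediction: typical cost values concentrate ever more tightly around their mean as the system grows.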
The difficulties of VQE did not prevent “quantum advantage” claims on current processors. In fact, these claims were made without using VQE. For instance, Google resorted to random quantum circuits, which are known to produce very entangled states with a small number of quantum gates, to argue that they had reached “quantum supremacy” over classical machines [11]. Their first claims were rebutted by tensor-network-based computations [12], but the newest generation of processors has likely reached the initial goal [13]. This is, however, still very far from any useful application.
Claims of useful quantum advantage were made by IBM in 2023 on a dynamical evolution problem [14], but they were quickly rebutted by classical computations, some of which were also based on tensor networks [15]. The relative ease with which classical computations reproduced or surpassed quantum computers can be attributed to the fact that physical systems usually obey constraints (such as symmetries and conservation laws) that limit the growth of correlations or entanglement, and therefore make them tractable by classical algorithms, up to a certain point. Currently, the point where classical algorithms cease to work is
Figure 4. Principle of quantum error correction: by grouping several physical qubits into one logical qubit, one makes more noise-robust qubits, provided the physical (individual) error rate is lower than a certain threshold.