Quantum AI can overcome the threat of “Barren Plateaus”

A new theorem demonstrates that quantum convolutional neural networks can always be trained, overcoming the threat of ‘barren plateaus’ in optimization problems.

Convolutional neural networks running on quantum computers have sparked a lot of interest because of their ability to analyze quantum data better than traditional computers can. While the applicability of such networks to large data sets has been limited by a fundamental trainability problem known as “barren plateaus,” new research removes that Achilles’ heel with a rigorous proof that guarantees scalability.
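As a rough guide to the terminology (a general sketch rather than the paper’s exact statement, with C denoting the cost function being minimized, θ_k one of its trainable parameters, and n the number of qubits), a barren plateau means the gradient’s variance over random parameter settings shrinks exponentially with system size, while a trainability guarantee of the kind reported here bounds that decay to be at worst polynomial:

```latex
% Barren plateau: the cost landscape flattens exponentially in the qubit number n,
% so exponentially many measurements are needed just to estimate a descent direction.
\operatorname{Var}_{\boldsymbol{\theta}}\!\left[\frac{\partial C(\boldsymbol{\theta})}{\partial \theta_k}\right]
  \in O\!\left(b^{-n}\right), \qquad b > 1.

% Trainability guarantee: the variance decays no faster than polynomially,
% so gradients remain large enough to follow as the system grows.
\operatorname{Var}_{\boldsymbol{\theta}}\!\left[\frac{\partial C(\boldsymbol{\theta})}{\partial \theta_k}\right]
  \in \Omega\!\left(\frac{1}{\operatorname{poly}(n)}\right).
```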

“The way you construct a quantum neural network can lead to a barren plateau—or not,” said Marco Cerezo, coauthor of the paper titled “Absence of Barren Plateaus in Quantum Convolutional Neural Networks,” published recently by a Los Alamos National Laboratory team in Physical Review X. Cerezo is a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos. “We proved the absence of barren plateaus for a special type of quantum neural network. Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters.”

Quantum convolutional neural networks are an artificial intelligence (AI) technology inspired by the visual cortex. As such, they use a series of convolutional layers, or filters, interleaved with pooling layers that reduce the dimension of the data while preserving its important features. These neural networks can be used to tackle a range of problems, including image recognition and material discovery. Overcoming barren plateaus is key to realizing quantum computers’ full potential in AI applications and to demonstrating their superiority over traditional computers.
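The following is a minimal sketch of that layered structure, written with the PennyLane library as an assumed tool; the specific gates, parameter layout, and qubit count are illustrative placeholders, not the circuit analyzed in the paper.

```python
# Sketch of a quantum convolutional neural network (QCNN): convolutional layers of
# parameterized two-qubit blocks, interleaved with pooling layers that halve the
# number of active qubits. Assumes the PennyLane library; gate choices are illustrative.
import pennylane as qml
import numpy as np

n_qubits = 8
dev = qml.device("default.qubit", wires=n_qubits)

def conv_layer(params, wires):
    """Convolutional layer: the same parameterized two-qubit block is slid
    across neighboring pairs of active qubits."""
    for a, b in zip(wires[0::2], wires[1::2]):
        qml.RY(params[0], wires=a)
        qml.RY(params[1], wires=b)
        qml.CNOT(wires=[a, b])

def pool_layer(param, wires):
    """Pooling layer: entangle each pair, then drop one qubit of the pair from
    the active set, halving the data dimension. Returns the surviving wires."""
    kept = []
    for a, b in zip(wires[0::2], wires[1::2]):
        qml.CRZ(param, wires=[a, b])  # push information from qubit a onto qubit b
        kept.append(b)                # qubit a is no longer acted on
    return kept

@qml.qnode(dev)
def qcnn(params, features):
    # Encode the input data as single-qubit rotations.
    for w in range(n_qubits):
        qml.RY(features[w], wires=w)

    wires = list(range(n_qubits))
    layer = 0
    while len(wires) > 1:                     # 8 -> 4 -> 2 -> 1 active qubits
        conv_layer(params[layer, :2], wires)
        wires = pool_layer(params[layer, 2], wires)
        layer += 1

    # Read out a single expectation value as the network's prediction.
    return qml.expval(qml.PauliZ(wires[0]))

params = 0.01 * np.random.randn(3, 3)           # 3 layers x 3 parameters each
features = np.random.uniform(0, np.pi, n_qubits)
print(qcnn(params, features))
```

Because each pooling step halves the active qubits, the circuit stays shallow (its depth grows only logarithmically with the number of qubits), which is the structural feature associated with the trainability guarantee.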

Researchers in quantum machine learning had previously studied how to lessen the consequences of barren plateaus, but they lacked a theoretical basis for completely avoiding them, according to Cerezo. The Los Alamos research demonstrates that some quantum neural networks are resistant to barren plateaus.

“With this guarantee in hand, researchers will now be able to sift through quantum-computer data about quantum systems and use that information for studying material properties or discovering new materials, among other applications,” said Patrick Coles, a quantum physicist at Los Alamos and a coauthor of the paper.

Many more applications for quantum AI algorithms will emerge, Coles believes, as researchers use near-term quantum computers more frequently and generate more and more data, since all machine learning programs are data-hungry.