
Forecasting AI blog


FSU researchers improve quantum machine learning algorithms

By G.H. | March 8, 2021 | Machine Learning

Research by Florida State University professors could help quantum computing deliver on its promise as a powerful computing tool.


William Oates, Cummins Inc. Professor of Mechanical Engineering and Chair of the Department of Mechanical Engineering at FAMU-FSU College of Engineering, and graduate student Guangyi Xu have found a way to automatically infer parameters used in an important quantum Boltzmann machine algorithm for machine learning.

Illustration: spot the differences between a bit and a qubit.
Their results have been published in Scientific Reports.

This work could help create artificial neural networks that could be used to train computers to solve complex, interrelated problems such as pattern recognition, drug discovery and the creation of new materials.

"There is a belief that quantum computing, as it comes online and grows in computational power, can provide you with some new tools, but figuring out how to program it and how to apply it to certain applications is a big question," Oates said.


Quantum bits, unlike the binary bits in a standard computer, can exist in multiple states at the same time, a concept known as superposition. Measuring the state of a quantum bit, or qubit, collapses that superposition to a single definite state, so quantum computers work by calculating the probability of a qubit's state before it is observed.
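The idea above can be sketched numerically. This is a minimal illustration, not real quantum hardware: a qubit state is represented as a normalized two-component vector of hypothetical amplitudes, the Born rule gives the outcome probabilities, and measurement collapses the state to one basis vector.

```python
import numpy as np

# A qubit state a|0> + b|1> as a normalized 2-vector (hypothetical amplitudes).
state = np.array([1.0, 1.0]) / np.sqrt(2.0)  # an equal superposition

# Born rule: the probability of each measurement outcome is the squared amplitude.
probs = np.abs(state) ** 2  # [0.5, 0.5] -- each outcome is equally likely

# Measurement picks one outcome at random and collapses the superposition,
# so the original amplitudes are lost.
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)
collapsed = np.zeros(2)
collapsed[outcome] = 1.0
```

After the measurement, only `collapsed` remains; repeating the experiment on fresh copies of `state` is the only way to estimate `probs` empirically.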

Specialized quantum computers, known as quantum annealers, are one tool for performing this type of computation. They work by encoding a problem so that each possible configuration of the qubits corresponds to an energy level; the lowest-energy configuration gives the solution to the problem. The result is a machine that can handle complex, interconnected systems that would take a normal computer a very long time to compute, such as building a neural network.
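To make the "lowest energy configuration is the answer" idea concrete, here is a toy Ising-style problem with made-up couplings and biases. A real annealer searches this energy landscape in hardware; the sketch below simply brute-forces all spin configurations classically, which only works for tiny problems.

```python
import itertools

# Hypothetical Ising problem: find spins s in {-1,+1}^n minimizing
# E(s) = sum_ij J_ij * s_i * s_j + sum_i h_i * s_i.
J = {(0, 1): -1.0, (1, 2): 1.0}   # couplings between qubit pairs (made up)
h = {0: 0.1, 1: -0.2, 2: 0.0}     # per-qubit biases (made up)
n = 3

def energy(s):
    e = sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    e += sum(hi * s[i] for i, hi in h.items())
    return e

# Classical brute force over all 2^n configurations stands in for the annealer.
best = min(itertools.product([-1, 1], repeat=n), key=energy)
print(best, energy(best))  # the minimum-energy spin configuration
```

The exhaustive search scales as 2^n, which is exactly why an annealer that explores the landscape in parallel is attractive for large, interconnected problems.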

Illustration of a quantum computer.
One way to build neural networks is to use a restricted Boltzmann machine, an algorithm that learns a probability distribution over the inputs given to the network. Oates and Xu found a way to automatically calculate an important parameter, an effective temperature, used in this algorithm. Restricted Boltzmann machines typically rely on a guess for this parameter, which must be confirmed by testing and can change whenever the computer is asked to investigate a new problem.
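To show where such a parameter enters, here is a minimal sketch of a restricted Boltzmann machine's hidden-unit activation, with an inverse-temperature factor beta scaling the energies. The weights, sizes, and the value of beta are all hypothetical; the paper's contribution is a method to infer this parameter automatically rather than fixing it by hand as done here.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical tiny RBM: 4 visible units, 3 hidden units.
W = rng.normal(scale=0.1, size=(4, 3))  # visible-to-hidden weights
b_h = np.zeros(3)                       # hidden biases

# Effective inverse temperature: rescales the energies when the model is
# sampled on an annealer. Fixed by hand here; Oates and Xu infer it.
beta = 2.0

def hidden_probs(v, beta):
    # P(h_j = 1 | v) = sigmoid(beta * (b_h[j] + v . W[:, j]))
    return sigmoid(beta * (b_h + v @ W))

v = np.array([1.0, 0.0, 1.0, 1.0])      # one example visible vector
p_h = hidden_probs(v, beta)
h_sample = (rng.random(3) < p_h).astype(float)  # stochastic hidden states
```

A mis-estimated beta distorts every conditional probability the machine computes, which is why getting it right matters for training the network effectively.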

"This parameter in the model replicates what quantum annealing does," Oates said. "If you can accurately estimate it, you can more effectively train your neural network and use it to predict things."

This research was supported by Cummins Inc. and used the resources of the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility.