Quantum computers have the potential to be exponentially faster than traditional computers, revolutionising the way we work. While we are still years away from general-purpose quantum computing, Bayesian Optimization can already help stabilise quantum circuits for certain applications. This blog summarises how Mind Foundry Optimize did just that.

Further details feature in this paper, which was submitted to Science. The team was composed of researchers from the University of Maryland, UCL, Cambridge Quantum Computing, Mind Foundry, Central Connecticut State University and IonQ.

Do you want to learn more about Bayesian Optimization? Click the link below to download the case study.

View the case study

 

The task

The researchers behind the paper were applying a hybrid quantum learning scheme on a trapped ion quantum computer to accomplish a generative modelling task. Generative models aim to learn representations of data in order to make subsequent tasks easier.

Hybrid quantum algorithms use both classical and quantum resources to solve potentially difficult problems. The Bars-and-Stripes (BAS) dataset was used in the study because it can be easily visualised as images containing horizontal bars and vertical stripes, where each pixel represents a qubit.
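As a concrete illustration (this snippet is not from the paper), the Python sketch below enumerates the valid BAS patterns for a 2×2 grid, the four-pixel case that maps onto the four-qubit experiment; the uniform target distribution at the end is an assumption for illustration.

```python
import itertools

import numpy as np


def bars_and_stripes(rows, cols):
    """Enumerate all valid Bars-and-Stripes (BAS) patterns for a rows x cols grid.

    A pattern is valid if every row is uniform ("stripes") or every
    column is uniform ("bars"). Each pixel maps to one qubit, so the
    2x2 grid below corresponds to the four-qubit experiment.
    """
    patterns = set()
    # Stripes: each row is independently all-0 or all-1.
    for row_bits in itertools.product([0, 1], repeat=rows):
        image = np.repeat(np.array(row_bits)[:, None], cols, axis=1)
        patterns.add(tuple(image.flatten()))
    # Bars: each column is independently all-0 or all-1.
    for col_bits in itertools.product([0, 1], repeat=cols):
        image = np.repeat(np.array(col_bits)[None, :], rows, axis=0)
        patterns.add(tuple(image.flatten()))
    return sorted(patterns)


patterns = bars_and_stripes(2, 2)
print(f"{len(patterns)} valid BAS(2,2) patterns out of {2 ** 4} possible bitstrings")

# Illustrative target for the generative model: a uniform distribution
# over the valid patterns (all other bitstrings have probability zero).
target = {p: 1 / len(patterns) for p in patterns}
```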

The experiment was performed on four qubits of a seven-qubit fully programmable trapped ion quantum computer. The quantum circuits are structured as layers of parameterised gates whose parameters are calibrated by the optimization algorithms. The following figure, taken from the paper, illustrates the set-up; a schematic code sketch of a generic layered ansatz follows the figure.

 

Figure 1: Quantum computing experiment set-up
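For intuition only, the sketch below builds a generic layered ansatz of parameterised single-qubit rotations followed by parameterised two-qubit XX interactions, using Qiskit. The gate choice, the connectivity argument and the resulting parameter count are illustrative assumptions and do not reproduce the exact circuits (or parameter counts) used in the paper.

```python
from itertools import combinations

from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector


def layered_ansatz(n_qubits=4, n_layers=2, pairs=None):
    """Build a generic layered ansatz: parameterised single-qubit rotations
    followed by parameterised two-qubit XX interactions in each layer.

    `pairs` controls the connectivity (default: all-to-all). This is a
    generic stand-in, not the exact gate sequence of the paper, and its
    parameter count differs from the circuits reported there.
    """
    if pairs is None:
        pairs = list(combinations(range(n_qubits), 2))  # all-to-all connectivity
    theta = ParameterVector("theta", n_layers * (n_qubits + len(pairs)))
    qc = QuantumCircuit(n_qubits)
    k = 0
    for _ in range(n_layers):
        for q in range(n_qubits):       # single-qubit rotation layer
            qc.rx(theta[k], q)
            k += 1
        for a, b in pairs:              # entangling XX layer
            qc.rxx(theta[k], a, b)
            k += 1
    return qc


circuit = layered_ansatz()
print(circuit.num_parameters, "tunable parameters")
# Star connectivity could instead be passed as pairs=[(0, 1), (0, 2), (0, 3)].
```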

 

Training the quantum circuit

The researchers used two optimization methods in the paper for the training algorithm:

  • Particle Swarm Optimization (PSO): a stochastic scheme that works by creating many “particles”, randomly distributed across the parameter space, that explore the landscape collaboratively
  • Bayesian Optimization with Mind Foundry Optimize: a global optimization paradigm that handles the expensive sampling of many-parameter functions by building and updating a surrogate model of the underlying objective function (a minimal sketch of this loop follows the list)
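Mind Foundry Optimize's internals are not reproduced here, but the generic Bayesian Optimization loop described above can be sketched as follows, using scikit-learn's Gaussian process as a stand-in surrogate and expected improvement as the acquisition function; the objective, bounds, candidate pool and iteration counts are placeholder assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def expected_improvement(candidates, gp, y_best):
    """Expected improvement acquisition function (for minimisation)."""
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    gamma = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(gamma) + sigma * norm.pdf(gamma)


def bayesian_optimize(objective, bounds, n_init=5, n_iter=30, seed=0):
    """Minimal Bayesian Optimization loop: fit a surrogate to past
    evaluations, pick the candidate maximising expected improvement,
    evaluate the expensive objective there, and repeat."""
    rng = np.random.default_rng(seed)
    lower, upper = np.array(bounds, dtype=float).T
    X = rng.uniform(lower, upper, size=(n_init, len(bounds)))   # initial design
    y = np.array([objective(x) for x in X])                     # expensive evaluations
    for _ in range(n_iter):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)                                            # surrogate model
        candidates = rng.uniform(lower, upper, size=(1000, len(bounds)))
        x_next = candidates[np.argmax(expected_improvement(candidates, gp, y.min()))]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    return X[np.argmin(y)], y.min()


# Toy objective standing in for the circuit training cost.
best_x, best_cost = bayesian_optimize(lambda x: float(np.sum((x - 0.3) ** 2)),
                                      bounds=[(0.0, 1.0)] * 3)
print(best_x, best_cost)
```

In the circuit-training setting, the placeholder `objective` would wrap an evaluation of the parameterised circuit (simulated or on hardware) and return the cost described below.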

The optimization process consists of simulating the training procedure on a classical simulator, in place of the quantum processor, for a given set of parameters.

Once the optimal parameters have been identified, the training procedure is then run on the trapped ion quantum computer shown in Figure 1. The cost functions used to quantify the difference between the BAS distribution and the experimental measurements of the circuit are variants of the original Kullback-Leibler divergence and are detailed in the paper.
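The exact cost functions are the modified variants detailed in the paper; as a plain illustration, the snippet below computes the standard Kullback-Leibler divergence between a uniform target over the six valid BAS(2,2) bitstrings and a measured distribution. The epsilon clip and the toy measured distribution are assumptions for the example.

```python
import numpy as np


def kl_divergence(p_target, q_measured, eps=1e-8):
    """D_KL(P || Q) between the target BAS distribution P and the empirical
    distribution Q estimated from circuit measurements. The eps clip guards
    against log(0) when a valid bitstring is never observed; the paper's
    cost functions are modified variants of this quantity."""
    p = np.asarray(p_target, dtype=float)
    q = np.clip(np.asarray(q_measured, dtype=float), eps, None)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))


# Toy example: uniform target over the six valid BAS(2,2) patterns, with
# pixels read row-major as a 4-bit string (indices 0, 3, 5, 10, 12, 15),
# compared against an untrained circuit that measures all 16 bitstrings
# uniformly (illustrative numbers only).
p = np.zeros(16)
p[[0, 3, 5, 10, 12, 15]] = 1 / 6
q = np.full(16, 1 / 16)
print(kl_divergence(p, q))
```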

 

Results and outlook

The training results for PSO and Mind Foundry Optimize are provided in the following figures:

Quantum circuit training results (calibration curves) with PSO

 

Quantum circuit training results (calibration curves) with Mind Foundry Optimize


 

Optimization results

The simulations are shown in orange and the trapped ion quantum computer results in blue. Column (a) corresponds to a circuit with two layers of gates and all-to-all connectivity. Columns (b) and (c) correspond to circuits with two and four layers and star connectivity, respectively. Circuits (a), (b) and (c) have 14, 11 and 26 tuneable parameters, respectively.

We observe that with PSO the circuit converges well to the BAS distribution only for the first circuit, whereas with Mind Foundry Optimize all circuits are able to converge. According to the researchers, the success of Mind Foundry Optimize on the 26-parameter circuit represents the most powerful hybrid quantum application to date.

If you would like to understand the work in more detail, please read the paper.

Additional information on Mind Foundry's work on this use case, and others by Mind Foundry, can be found here.

If you or your company are interested in API access to MFOptimize and other powerful capabilities of the Mind Foundry Platform, please contact us or schedule a demo.


 

Download the report on Bayesian Optimization


Written by Dr. Alessandra Tosi

Alessandra is a senior scientist and product owner, with a PhD in probabilistic machine learning and on a mission to push the boundaries of AI through the integration of software solutions and original research.
