Generative modelling is a branch of machine learning that learns a model of the full process generating your data. The technique is already widely used, for example in speech synthesis with Variational Auto-Encoders. Combining quantum computing with generative modelling, however, is expected to go beyond what Deep Learning alone can achieve. A team of researchers set out to build a hybrid quantum computing approach to generative modelling on a trapped-ion quantum computer.
The generative model is created by tuning the quantum circuit so that its output matches the statistical properties of the original data. This tuning runs in a loop, trying one configuration after another until the output matches the expected data sufficiently well. Because each trial means running a costly experiment on the quantum hardware, it is crucial to finish this loop in as few trials as possible. The researchers reached out to Mind Foundry because they needed a better method for performing this optimisation loop.
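The structure of that loop can be sketched in a few lines. This is a simplified, hypothetical illustration, not the researchers' actual method: `circuit_divergence` stands in for running the real circuit and measuring how far its output distribution is from the data, and `suggest_next` is a crude perturbation-based proposer where a real Bayesian optimiser would fit a surrogate model to the history and maximise an acquisition function such as expected improvement.

```python
import random

def circuit_divergence(params):
    # Hypothetical stand-in for a costly quantum experiment: returns a
    # divergence between the circuit's output distribution and the target
    # data (lower is better). A toy quadratic plays that role here.
    return sum((p - 0.3) ** 2 for p in params)

def suggest_next(history, n_params, rng):
    # A real Bayesian optimiser would fit a surrogate model to `history`
    # and pick the parameters that maximise expected improvement; this
    # sketch just perturbs the best configuration seen so far.
    if not history:
        return [rng.uniform(-1.0, 1.0) for _ in range(n_params)]
    best_params, _ = min(history, key=lambda h: h[1])
    return [p + rng.gauss(0.0, 0.1) for p in best_params]

def optimise(n_params=4, budget=50, seed=0):
    rng = random.Random(seed)
    history = []
    for _ in range(budget):  # each iteration is one costly experiment
        params = suggest_next(history, n_params, rng)
        history.append((params, circuit_divergence(params)))
    return min(history, key=lambda h: h[1])

best_params, best_loss = optimise()
print(best_loss)
```

The point of replacing `suggest_next` with a proper Bayesian optimiser is sample efficiency: the surrogate model lets each expensive experiment inform where to try next, so far fewer trials are wasted.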
Mind Foundry’s Bayesian Optimization technology was able to incorporate all available domain knowledge about the quantum circuit elegantly, and this proved decisive in creating a viable hybrid algorithm that trains the generative model efficiently and reaches a result quickly. The results are published in Science Advances.
This demonstration of generative modelling using reconfigurable quantum circuits with up to 26 parameters is one of the most powerful hybrid quantum applications to date.
View the full results in Science Advances.