
Seminar 25 June @11am

Quantum Natural Gradient for Variational Bayes


Date: 25 June 2021, Friday

Time: 11:00am - 12:00pm AEST

Speaker: Anna Lopatnikova (University of Sydney)

Abstract:

In this talk, we start with a general introduction to quantum computing. We then focus on Variational Bayes (VB), a critical method in machine learning and statistics that underpins the recent success of Bayesian deep learning. Even though VB is efficient and scalable relative to alternative methods, it remains too computationally intensive for many practical applications, particularly in high-dimensional settings. We propose a quantum-classical algorithm to speed up VB through efficient computation of the natural gradient, one of the most promising acceleration methods but one that is itself too computationally intensive in high dimensions. To achieve quantum speedup, we proceed in two steps. First, we reformulate natural gradient estimation for VB as a linear problem. Second, we adapt quantum algorithm building blocks to this problem and demonstrate that it meets the quantum algorithms' stringent requirements. We demonstrate the method's power in a classical simulation in which the variational distribution is a computationally intensive neural network. At the end of the talk, we highlight further potential applications of quantum computing, for example obtaining approximate solutions to combinatorial optimization problems (such as those arising in transportation or communication).
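
For readers unfamiliar with the natural gradient, the following is a minimal classical sketch (not the quantum-classical algorithm presented in the talk) of a single natural-gradient step for VB. The update is F^{-1} times the ordinary ELBO gradient, where F is the Fisher information matrix of the variational distribution; rather than inverting F, the step is obtained by solving a linear system, which is the "linear problem" referred to in the abstract. The function and parameter names (natural_gradient_step, elbo_gradient, fisher_matrix) are illustrative placeholders, not names from the preprint.

```python
import numpy as np

def natural_gradient_step(lam, elbo_gradient, fisher_matrix, lr=0.1, damping=1e-6):
    """One natural-gradient ascent step on the ELBO (classical sketch).

    lam            : current variational parameters, shape (d,)
    elbo_gradient  : callable lam -> ordinary ELBO gradient, shape (d,)
    fisher_matrix  : callable lam -> Fisher information of q_lambda, shape (d, d)
    """
    grad = elbo_gradient(lam)
    # Small damping keeps the Fisher matrix well conditioned and invertible.
    F = fisher_matrix(lam) + damping * np.eye(lam.size)
    # Natural gradient: solve F @ delta = grad instead of forming F^{-1} explicitly.
    delta = np.linalg.solve(F, grad)
    return lam + lr * delta
```

The cost of this classical step is dominated by the linear solve, which scales poorly with the dimension of the variational parameters; the talk's quantum speedup targets exactly this bottleneck.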
Link to preprint: https://arxiv.org/abs/2106.05807