

Showing posts from June, 2021

Seminar 25 June @11am

Quantum Natural Gradient for Variational Bayes
Date: 25 June 2021, Friday
Time: 11:00am - 12:00pm AEDT
Speaker: Anna Lopatnikova (University of Sydney)
Abstract: In this talk, we start by providing a general introduction to quantum computing. We then focus on Variational Bayes (VB), a critical method in machine learning and statistics that underpins the recent success of Bayesian deep learning. Even though VB is efficient and scalable relative to alternative methods, it remains too computationally intensive for many practical applications, particularly in high-dimensional settings. We propose a quantum-classical algorithm to speed up VB through efficient computation of the natural gradient, one of the most promising speedup methods but one that is too computationally intensive in high dimensions. To achieve quantum speedup, we proceed in two steps: first, we reformulate the problem of natural gradient estimation for VB as a linear problem. ...
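
The natural gradient the abstract refers to is the ordinary ELBO gradient preconditioned by the inverse Fisher information of the variational family. The sketch below is a purely classical, toy-sized illustration of that step (a diagonal Gaussian variational family and a standard-Gaussian target, none of which comes from the talk); here the Fisher matrix is diagonal and trivial to invert, whereas the speaker's quantum-classical algorithm targets the high-dimensional case where this preconditioning is the bottleneck.

import numpy as np

# Classical toy sketch of a natural-gradient VB step; nothing here is from the talk.
# Variational family: q = N(mu, diag(sigma^2)) parameterised by (mu, log_sigma).
# Target log-joint: log p(theta) = -0.5 * ||theta||^2 (a standard Gaussian).

def elbo_gradients(mu, log_sigma, n_samples=500, seed=0):
    """Reparameterised Monte Carlo gradients of the ELBO w.r.t. (mu, log_sigma)."""
    rng = np.random.default_rng(seed)
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal((n_samples, mu.size))
    theta = mu + sigma * eps            # reparameterisation trick
    dlogp = -theta                      # gradient of the toy log-joint
    g_mu = dlogp.mean(axis=0)           # the Gaussian entropy does not depend on mu
    g_log_sigma = (dlogp * eps * sigma).mean(axis=0) + 1.0   # +1 from the entropy term
    return g_mu, g_log_sigma

def natural_gradient_step(mu, log_sigma, lr=0.1):
    """Precondition the ELBO gradient by the inverse Fisher information.
    For this parameterisation the Fisher matrix is diagonal: 1/sigma^2 for mu and
    2 for log_sigma, so inverting it is trivial."""
    g_mu, g_ls = elbo_gradients(mu, log_sigma)
    nat_g_mu = np.exp(2 * log_sigma) * g_mu    # multiply by sigma^2
    nat_g_ls = g_ls / 2.0
    return mu + lr * nat_g_mu, log_sigma + lr * nat_g_ls

mu, log_sigma = np.ones(5), np.zeros(5)
for _ in range(200):
    mu, log_sigma = natural_gradient_step(mu, log_sigma)
print(mu.round(3), np.exp(log_sigma).round(3))   # approximately 0 and 1, up to Monte Carlo error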

Seminar 18 June @10am

Spatial Confounding and Restricted Spatial Regression Methods
Date: 18 June 2021, Friday
Time: 10am AEDT
Speaker: Prof Catherine Calder (University of Texas at Austin)
Abstract: Over the last fifteen years, spatial confounding has emerged as a significant source of concern when interpretable inference on regression coefficients is a primary goal of a spatial regression analysis. Numerous approaches to alleviate spatial confounding have been proposed in the literature, many of which have close connections to dimension reduction techniques used to facilitate faster model fitting. In this presentation, I discuss the issue of spatial confounding in the context of the spatial generalized mixed model for areal data. In particular, I show how many of the techniques for dealing with spatial confounding in this setting can be viewed as special cases of what we refer to as restricted spatial regression (RSR) models. Theoretical characterizations of the posterior distribution of regress...
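
As a rough illustration of the restricted spatial regression idea the abstract discusses, the sketch below builds a spatial basis constrained to lie in the orthogonal complement of the fixed-effect design space, so the spatial random effect cannot compete with the covariates. The design matrix and adjacency matrix are random placeholders, and the particular construction (leading eigenvectors of the projected adjacency, a Moran-operator style basis) is one common variant, not anything specific to the talk.

import numpy as np

# Placeholder design matrix and adjacency; both are random stand-ins, not real data.
rng = np.random.default_rng(1)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])   # fixed-effect design
W = (rng.random((n, n)) < 0.05).astype(float)                        # toy adjacency matrix
W = np.triu(W, 1)
W = W + W.T                                                          # symmetric, zero diagonal

# Projection onto the orthogonal complement of the column space of X.
P_perp = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)

# Moran-operator style restricted basis: leading eigenvectors of P_perp W P_perp.
# Any spatial random effect built from these columns is orthogonal to the covariates,
# which is the defining restriction of an RSR model.
eigvals, eigvecs = np.linalg.eigh(P_perp @ W @ P_perp)
order = np.argsort(eigvals)[::-1]
M = eigvecs[:, order[:10]]          # first 10 restricted spatial basis functions

# Sanity check: the basis is numerically orthogonal to the design matrix.
print(np.abs(X.T @ M).max())        # close to machine precision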

Seminar 17 June @12pm

Variational Bayes on Manifolds
Date: 17 June 2021, Thursday
Time: 12pm AEDT
Contact the organizer: Andriy Olenko a.olenko@latrobe.edu.au
Speaker: A/Prof Minh-Ngoc Tran (University of Sydney)
Abstract: Variational Bayes (VB) has become a widely used tool for Bayesian inference in statistics and machine learning. Nonetheless, the development of existing VB algorithms has so far generally been restricted to the case where the variational parameter space is Euclidean, which hinders the potential for broad application of VB methods. This paper extends the scope of VB to the case where the variational parameter space is a Riemannian manifold. We develop an efficient manifold-based VB algorithm that exploits both the geometric structure of the constrained parameter space and the information geometry of the manifold of VB approximating probability distributions. Our algorithm is provably convergent and achieves a decent convergence rate. We develop in particular several manifold VB algorithms includi...
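
To give a concrete feel for optimisation over a non-Euclidean variational parameter space, here is a minimal sketch of Riemannian gradient descent on the manifold of symmetric positive-definite matrices, the kind of constrained parameter (a covariance matrix) the abstract has in mind. The toy objective, the affine-invariant metric, and the particular retraction are illustrative assumptions, not the algorithm from the paper.

import numpy as np

# Toy objective on the SPD manifold: f(Sigma) = tr(Sigma^{-1} S) + log det Sigma,
# which is minimised at Sigma = S. Everything here is an illustration only.

rng = np.random.default_rng(0)
d = 4
A = rng.standard_normal((d, d))
S = A @ A.T + d * np.eye(d)          # arbitrary SPD target

def riemannian_grad(Sigma):
    """Riemannian gradient under the affine-invariant metric, i.e.
    Sigma @ euclidean_grad @ Sigma; for this objective it simplifies to Sigma - S."""
    return Sigma - S

def retract(Sigma, xi):
    """Second-order retraction on the SPD manifold: Sigma + xi + 0.5 * xi Sigma^{-1} xi.
    It keeps the iterate symmetric positive definite for any symmetric step xi."""
    return Sigma + xi + 0.5 * xi @ np.linalg.solve(Sigma, xi)

Sigma, lr = np.eye(d), 0.2
for _ in range(100):
    Sigma = retract(Sigma, -lr * riemannian_grad(Sigma))

print(np.max(np.abs(Sigma - S)))     # should be essentially zero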

Seminar 10 June @10am

On hyperparameter tuning in general clustering problems
Date: Thursday, 10 June 2021
Time: 10-11am
Speaker: Dr Rachel (Ying) Wang (The University of Sydney)
Abstract: Selecting hyperparameters for unsupervised learning problems is challenging in general due to the lack of ground truth for validation. Despite the prevalence of this issue in statistics and machine learning, especially in clustering problems, there are not many methods for tuning these hyperparameters with theoretical guarantees. In this paper, we provide a framework with provable guarantees for selecting hyperparameters in a number of distinct models. We consider both the subgaussian mixture model and network models to serve as examples of iid and non-iid data. We demonstrate that the same framework can be used to choose the Lagrange multipliers of penalty terms in semi-definite programming (SDP) relaxations for community detection, and the bandwidth parameter for constructing kernel sim...
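
For contrast with the provable framework described in the talk, the snippet below shows the usual ad hoc way a kernel bandwidth is tuned for spectral clustering: scan a grid of candidates and score each one with an internal criterion (here the silhouette score on synthetic data). The data, the grid, and the criterion are all illustrative assumptions rather than the speaker's method.

import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.datasets import make_blobs
from sklearn.metrics import pairwise_distances, silhouette_score

# Synthetic data and a plain internal criterion; a common baseline, not the
# framework with guarantees described in the talk.
X, _ = make_blobs(n_samples=300, centers=3, cluster_std=1.0, random_state=0)
D2 = pairwise_distances(X) ** 2                  # squared Euclidean distances

def cluster_with_bandwidth(h, n_clusters=3):
    """Spectral clustering on the Gaussian kernel similarity with bandwidth h."""
    K = np.exp(-D2 / (2.0 * h ** 2))
    model = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                               random_state=0)
    return model.fit_predict(K)

# Scan a grid of bandwidths and keep the one with the best silhouette score.
bandwidths = np.geomspace(0.1, 10.0, 15)
scores = [silhouette_score(X, cluster_with_bandwidth(h)) for h in bandwidths]
best_h = bandwidths[int(np.argmax(scores))]
print(f"selected bandwidth: {best_h:.3f}")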