Do you have a moment? Bayesian inference using estimating equations via empirical likelihood
Speaker: Professor Howard Bondell, University of Melbourne
Time: 2-3PM Friday 21 Oct
Zoom link at https://uni-sydney.zoom.us/j/89779295453
Bayesian inference typically relies on specification of a likelihood as a key ingredient. Recently, likelihood-free approaches have become popular as a way to avoid specifying potentially intractable likelihoods. Alternatively, in the frequentist context, estimating equations are a popular choice for inference, corresponding to an assumption on a set of moments (or expectations) of the underlying distribution rather than on its exact form. Common examples include generalised estimating equations for correlated responses, and M-estimators for robust regression that avoid distributional assumptions on the errors. In this talk, I will discuss some of the motivation behind empirical likelihood and how it can be used to carry out a fully Bayesian analysis in these settings where only a specification of moments is desired. This then allows one to take advantage of prior distributions that have been developed to accomplish various shrinkage tasks, both theoretically and in practice. I will further discuss computational issues that arise due to non-convexity of the support of this likelihood and the corresponding posterior, and show how this can be rectified to allow MCMC and variational approaches to perform posterior inference.
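
For background, here is a minimal sketch, not taken from the talk, of how the profile empirical likelihood for a moment condition E[g(X, theta)] = 0 can be computed through its standard Lagrangian dual; the function names and the sample-mean example are illustrative assumptions only. A Bayesian empirical likelihood analysis, in its usual formulation, replaces the ordinary likelihood with this quantity and combines it with a prior on theta.

import numpy as np
from scipy.optimize import minimize

def log_el(theta, x, g):
    """Profile log empirical likelihood ratio at theta.

    g(x, theta) must return an (n, p) array of estimating-function values.
    The EL weights are w_i = 1 / (n * (1 + lam @ g_i)), with lam the
    minimiser of the convex dual below, so that
    log R(theta) = -sum_i log(1 + lam @ g_i).
    """
    G = np.asarray(g(x, theta))          # (n, p) moment contributions

    def dual(lam):
        arg = 1.0 + G @ lam
        if np.any(arg <= 1e-10):         # outside the log barrier's domain
            return 1e10                  # large penalty instead of +inf
        return -np.sum(np.log(arg))

    # The dual is convex; the profile problem is only feasible when zero lies
    # in the convex hull of the rows of G -- the support issue from the abstract.
    res = minimize(dual, np.zeros(G.shape[1]), method="Nelder-Mead")
    return res.fun                       # = log R(theta), <= 0 when feasible

# Example: moment condition for a mean, g(x, theta) = x - theta.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=200)
g_mean = lambda x, theta: (x - theta).reshape(-1, 1)
print(log_el(x.mean(), x, g_mean))   # approximately 0 at the sample mean
print(log_el(2.0, x, g_mean))        # clearly negative away from it

# A Bayesian empirical likelihood posterior is proportional to
# prior(theta) * exp(log_el(theta, x, g)), which MCMC or variational
# methods can then target.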