
Scalable Inference for Entropy-Regularised Variational Objectives

Project ID: 2531bd1707

(You will need this ID for your application)

Research Theme: Mathematical Sciences

Research Area(s): Statistics and applied probability; Artificial intelligence technologies

UCL Lead department: Statistical Science


Lead Supervisor: Alexandros Beskos

Project Summary:

Why this research is important

In modern machine learning and scientific computing, predictive performance is an increasingly important consideration. In this context, classical Bayesian posteriors - which are tied to a restrictive likelihood-based updating mechanism and often neglect predictive risk - can be suboptimal. To address this, a host of new posteriors have recently been derived, which can provably address these shortcomings.

What you will be doing

This project will develop new algorithms for posteriors obtained as the optima of entropy-regularised variational objectives. Such distributions are defined as the solutions to infinite-dimensional optimisation problems over the space of measures, and thus do not typically admit closed-form solutions or straightforward MCMC sampling.
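To fix ideas, a representative objective of this kind (the notation below is illustrative, not taken from the project description) selects the posterior as the minimiser, over all probability measures, of an expected loss plus an entropy-type regulariser:

$$
q^\star \;=\; \arg\min_{q \in \mathcal{P}(\Theta)} \; \mathbb{E}_{\theta \sim q}\big[\ell(\theta)\big] \;+\; \lambda \,\mathrm{KL}(q \,\|\, \pi),
$$

where $\ell$ is a loss function, $\pi$ is a prior or reference measure, and $\lambda > 0$ controls the regularisation. When $\ell$ is the negative log-likelihood and $\lambda = 1$, $q^\star$ recovers the classical Bayesian posterior; other choices of $\ell$ and $\lambda$ yield the generalised posteriors referred to above.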

Instead, new computational strategies are required. These include approaches based on Wasserstein gradient flows, particle-based algorithms, and extensions of variational gradient descent. To make such approaches practically feasible, several challenges must be addressed, including convergence guarantees, principled strategies for hyperparameter tuning, and finite-sample error bounds. This project will focus on deriving such guarantees, which will inform both how existing algorithms should be implemented and the design of new ones.
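As a minimal sketch of the particle-based viewpoint: the Wasserstein gradient flow of the KL divergence to a target measure is realised by Langevin dynamics, and a simple time-discretisation gives the unadjusted Langevin algorithm, run here as a cloud of independent particles targeting a standard Gaussian. All names and parameter values below are illustrative choices, not part of the project description.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_pi(x):
    # Score of a standard Gaussian target pi(x) ∝ exp(-x^2 / 2).
    return -x

def langevin_particles(n_particles=2000, n_steps=500, step=0.05):
    # Initialise the particle cloud far from the target.
    x = rng.normal(loc=5.0, scale=0.1, size=n_particles)
    for _ in range(n_steps):
        # Euler-Maruyama step of the Langevin SDE: drift up the
        # score, plus Gaussian noise scaled by sqrt(2 * step).
        noise = rng.normal(size=n_particles)
        x = x + step * grad_log_pi(x) + np.sqrt(2 * step) * noise
    return x

samples = langevin_particles()
print(samples.mean(), samples.std())  # close to 0 and 1 after convergence
```

The discretisation step introduces a small bias in the stationary distribution (the empirical standard deviation sits slightly above 1 for positive step sizes); quantifying exactly this kind of error, for richer interacting-particle schemes, is the sort of finite-sample analysis the project targets.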

Who you will be working with

This project will be supervised by Prof Alexandros Beskos, Dr Louis Sharrock and Dr Jeremias Knoblauch, all in the Department of Statistical Science. Dr Sharrock is an expert in the development and analysis of scalable computational methods for modern machine learning, including Wasserstein gradient flows, Markov chain Monte Carlo, particle-based variational inference, and related methods. Dr Knoblauch is an international leader in generalised- and post-Bayesian inference, and leads UCL’s group on the ‘Fundamentals of Statistical Machine Learning’.

Who we are looking for

This project will be well suited to a student with a strong mathematical and statistical background, with an interest in methodological or theoretical questions at the intersection of machine learning, Bayesian statistics, and stochastic analysis.