Dino Sejdinovic - Developments at the Interface Between Kernel Embeddings and Gaussian Processes
Date:
December 2, 2021
Author:
Vincent Adam, Hrvoje Stojic
Recent Developments at the Interface Between Kernel Embeddings and Gaussian Processes
Abstract
Reproducing kernel Hilbert spaces (RKHSs) provide a powerful framework, termed kernel mean embeddings, for representing probability distributions, enabling nonparametric statistical inference in a variety of applications. I will give an overview of this framework and present some of its recent developments, which combine the RKHS formalism with Gaussian process modelling. Recent applications include causal data fusion, where data of varying quality must be combined in order to estimate the average treatment effect, as well as statistical downscaling using potentially unmatched multi-resolution data.
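As a brief illustration of the kernel mean embedding idea mentioned in the abstract: a distribution P is represented by the mean of the kernel feature map under P, and the RKHS distance between two such embeddings is the maximum mean discrepancy (MMD), which can be estimated directly from samples via kernel evaluations. The sketch below (an assumption of ours, not code from the talk) computes a simple biased MMD estimate with an RBF kernel in NumPy; the lengthscale and sample sizes are illustrative choices.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Gram matrix with entries k(x, y) = exp(-||x - y||^2 / (2 * lengthscale^2))
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def mmd2(X, Y, lengthscale=1.0):
    # Biased empirical estimate of ||mu_P - mu_Q||^2 in the RKHS,
    # where mu_P, mu_Q are the kernel mean embeddings of P and Q.
    Kxx = rbf_kernel(X, X, lengthscale)
    Kyy = rbf_kernel(Y, Y, lengthscale)
    Kxy = rbf_kernel(X, Y, lengthscale)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 1))  # sample from P
Y = rng.normal(0.0, 1.0, size=(200, 1))  # independent sample from the same P
Z = rng.normal(3.0, 1.0, size=(200, 1))  # sample from a shifted distribution Q

print(mmd2(X, Y))  # near zero: same underlying distribution
print(mmd2(X, Z))  # clearly positive: distributions differ
```

This sample-based view of distributions is the starting point for the developments in the talk, where the embeddings themselves are modelled with Gaussian processes to propagate uncertainty.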
Notes
References:
S. L. Chau, S. Bouabid, and D. Sejdinovic, Deconditional Downscaling with Gaussian Processes, in Advances in Neural Information Processing Systems (NeurIPS), 2021, forthcoming. https://arxiv.org/pdf/2105.12909.pdf
S. L. Chau, J.-F. Ton, J. Gonzalez, Y. W. Teh, and D. Sejdinovic, BayesIMP: Uncertainty Quantification for Causal Data Fusion, in Advances in Neural Information Processing Systems (NeurIPS), 2021, forthcoming. https://arxiv.org/pdf/2106.03477.pdf
Dino Sejdinovic is an Associate Professor at the Department of Statistics, University of Oxford, a Fellow of Mansfield College, Oxford, and a Turing Fellow of the Alan Turing Institute.