Statisfaction

Awesome Bristol

Posted in Seminar/Conference by Pierre Jacob on 24 April 2012

Hey,

Last week there was a workshop on Confronting Intractability in Statistical Inference, organised by the University of Bristol and the SuSTain group and hosted at Goldney Hall. It turned out to be a succession of fascinating talks about recent developments and the future of statistical methods for very challenging inference problems. What I appreciated above all was the ambition of many talks, and the generosity of the speakers in sharing so many ideas with the audience.

Among the things I learned there, the following struck me as the most ambitious.

  • Scott Schmidler from Duke gave an introductory presentation on how complexity is defined and analysed in computer science, classifying problems by the time needed to solve them in the worst case (polynomial or exponential?). The main message was: don’t spend too much time trying to solve a problem that computer scientists have already classified as “impossible”, and use tools from computer science to analyse the difficulty of statistical inference problems.
  • In another analogy, Christian L. Müller from ETH Zürich gave a talk comparing recent optimization methods (like variable-metric algorithms) with adaptive MCMC; they are similar methods even though they are expressed in different terms, although the efficiency assessments are necessarily quite different (sampling from a distribution is not exactly like finding the maximum of its probability density function). Building on this analogy, he proposed to think about benchmark test suites for sampling problems, just as there are test suites for optimization problems. That could be very nice, since many articles already use very similar problems, e.g. mixture models or banana-shaped distributions, as toy examples illustrating multi-modality and ridges in the target distribution (a sketch of one such banana-shaped benchmark appears after this list).
  • Sebastian Reich from Universität Potsdam gave a talk about the links between sampling problems and optimal transport in measure theory, which are related to Bayesian optimal maps and implicit particle filters. It is mainly about alternatives to standard Monte Carlo (e.g. importance sampling) for “moving” a sample from a distribution f_1 to a distribution f_2, relying on deeper arguments connecting two closely related probability distributions (the basic transport problem is written out after this list); from what I understood it is still at an experimental stage in practice. Probably something to keep a careful eye on!
  • Cédric Ginestet from King’s College London presented a poster about Fréchet means (a definition is recalled after this list) and his results on the topic (see here and here for his articles). The idea is that on a general metric state space some classic results in statistics can be extended (for instance to test the difference between two means); it sounds purely theoretical, but it is actually driven by many applications: your data can consist not only of numbers but also of images, trees or networks (for instance the data describing a patient in a hospital). Then how do you define a mean? To what does the mean converge when the sample size grows? How do you test the difference between two groups?
  • Geoff Nicholls from the University of Oxford gave a very inspiring talk on variations of the Metropolis–Hastings algorithm when the target density cannot be evaluated point-wise. In this setting the pseudo-marginal approach applies when one can get unbiased estimates of the density; Nicholls deals with the case where one can get (essentially) normally distributed estimates of the log-density instead, relying on recent articles in the physics literature. This could lead to massive improvements in applications where the data set is very large and the log-likelihood can be written as a sum of independent terms (as in many regression models); a toy sketch of that setting appears after this list. Really looking forward to seeing the forthcoming paper on this!
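
To make the benchmark idea concrete, here is a minimal sketch of one target that already recurs in the adaptive-MCMC literature: a two-dimensional “banana-shaped” distribution obtained by twisting a Gaussian (in the spirit of Haario et al.), paired with a plain random-walk Metropolis sampler as a crude baseline. The parameter values and function names are mine, for illustration only, and are not anything proposed at the workshop.

```python
import numpy as np

def log_banana(x, b=0.1, sigma1=10.0):
    """Log-density (up to a constant) of a 2-D 'banana-shaped' target:
    a N(0, diag(sigma1^2, 1)) Gaussian whose second coordinate is twisted
    by the map x2 -> x2 + b * (x1^2 - sigma1^2)."""
    x1, x2 = x[0], x[1]
    y2 = x2 + b * (x1 ** 2 - sigma1 ** 2)
    return -0.5 * (x1 ** 2 / sigma1 ** 2 + y2 ** 2)

def rw_metropolis(logpdf, x0, n_iter=10_000, step=1.0, seed=0):
    """Plain random-walk Metropolis, a baseline against which adaptive
    samplers could be scored on such a benchmark."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = logpdf(x)
    chain = np.empty((n_iter, x.size))
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = logpdf(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

chain = rw_metropolis(log_banana, x0=[0.0, 0.0], step=2.0)
```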
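
For the optimal-transport item, the object underneath, as I understand it, is the classical Monge problem: find a deterministic map T that pushes f_1 forward to f_2 while minimising the expected squared displacement,

```latex
\min_{T \,:\, T_{\#} f_1 = f_2} \int \lVert x - T(x) \rVert^2 \, f_1(x) \, \mathrm{d}x ,
```

so that applying T to a sample from f_1 yields (exactly or approximately) a sample from f_2, without importance weights.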
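
For the Fréchet-mean poster, here is the standard definition, recalled for completeness: for observations x_1, ..., x_n in a metric space (M, d), the sample Fréchet mean is

```latex
\hat{\mu} = \operatorname*{arg\,min}_{m \in M} \; \sum_{i=1}^{n} d(x_i, m)^2 ,
```

which reduces to the ordinary sample mean when M is R^p with the Euclidean distance.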
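
Finally, here is a toy sketch of the noisy log-likelihood setting; it is emphatically not Nicholls’s forthcoming method. The log-likelihood is a sum of n independent terms, a random subsample of m terms gives an unbiased and approximately Gaussian estimate of it, and the acceptance ratio is corrected with the “penalty” of Ceperley and Dewing from the physics literature (subtracting half the estimated variance of the noisy log-ratio). The model (i.i.d. Gaussian data with a flat prior on the mean) and all names are illustrative assumptions of mine.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
data = rng.normal(loc=1.5, scale=1.0, size=n)  # illustrative data set

def loglik_terms(theta, idx):
    """Independent log-likelihood terms log f(y_i | theta) for a N(theta, 1) model."""
    y = data[idx]
    return -0.5 * (y - theta) ** 2 - 0.5 * np.log(2 * np.pi)

def noisy_loglik(theta, m=1_000):
    """Unbiased, roughly Gaussian estimate of the full log-likelihood from a
    random subsample of m terms, together with its estimated variance."""
    idx = rng.integers(0, n, size=m)
    terms = loglik_terms(theta, idx)
    return n * terms.mean(), (n ** 2 / m) * terms.var(ddof=1)

def penalised_mh(theta0, n_iter=5_000, step=0.05):
    """Random-walk Metropolis on the noisy log-likelihood (flat prior on theta),
    with the Ceperley-Dewing penalty: subtract half the variance of the noisy
    log acceptance ratio.  Re-using the stored estimate for the current state
    is a further simplification."""
    theta, chain = theta0, np.empty(n_iter)
    ll, var = noisy_loglik(theta)
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal()
        ll_prop, var_prop = noisy_loglik(prop)
        log_ratio = ll_prop - ll - 0.5 * (var_prop + var)  # penalty for estimation noise
        if np.log(rng.uniform()) < log_ratio:
            theta, ll, var = prop, ll_prop, var_prop
        chain[i] = theta
    return chain

chain = penalised_mh(theta0=0.0)
```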

I can’t get into too much detail since it’s mostly unpublished work, but I wanted to give an overview for those who couldn’t attend the workshop.

Pierre
