Statistical inference on MCMC traces

Hi everyone, and Happy New Year! This post is about some statistical inference that one can perform using the output of MCMC algorithms as “data”. Consider the trace plot above. It was generated by Metropolis–Hastings with a Normal random walk proposal of standard deviation “sigma”, on a certain target. Suppose that you are…
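For context, here is a minimal R sketch of Metropolis–Hastings with a Normal random walk proposal of standard deviation sigma; the standard Normal target and all tuning values below are placeholders, not the ones used in the post.

# Random walk Metropolis–Hastings: minimal sketch (placeholder target and tuning).
set.seed(1)
log_target <- function(x) dnorm(x, mean = 0, sd = 1, log = TRUE)  # placeholder target
rwmh <- function(n_iter, x0, sigma) {
  chain <- numeric(n_iter)
  x <- x0
  for (t in 1:n_iter) {
    proposal <- x + sigma * rnorm(1)                   # Normal random walk proposal
    log_ratio <- log_target(proposal) - log_target(x)  # symmetric proposal: only the target ratio matters
    if (log(runif(1)) < log_ratio) x <- proposal       # accept or reject
    chain[t] <- x
  }
  chain
}
trace <- rwmh(n_iter = 5000, x0 = 0, sigma = 0.5)
plot(trace, type = "l")  # a trace plot of the kind shown in the post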

French Habilitation: Bayesian statistical learning and applications

Long time no see, Statisfaction! I’m glad to write about my habilitation, entitled Bayesian statistical learning and applications, which I defended yesterday at Inria Grenoble. This Habilitation à Diriger des Recherches (HDR) is the highest degree issued through a university examination in France. If I am to believe official texts, the HDR recognizes a candidate’s “high…”

JRSS: Series B read paper and comparison with other unbiased estimators

Hi all, The paper “Unbiased Markov chain Monte Carlo with couplings”, co-written with John O’Leary and Yves Atchadé, has been accepted as a read paper in JRSS: Series B, to be presented on December 11 at the Royal Statistical Society. Comments (400 words max) can be submitted until two weeks after the presentation, that is December 28; …

BayesBag, and how to approximate it

Hi all, This post describes how unbiased MCMC can help in approximating expectations with respect to “BayesBag”, an alternative to standard posterior distributions mentioned in Peter Bühlmann’s discussion of Big Bayes Stories (a special issue of Statistical Science). Essentially, BayesBag is the result of “bagging” applied to “Bayesian inference”. In passing, here is an R…
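To make the idea of “bagging Bayesian inference” concrete, here is a hedged R sketch in which posterior expectations are averaged over bootstrap resamples of the data. The conjugate Normal model and all settings are illustrative placeholders, not the example from the post; in non-conjugate models each bootstrap posterior would typically be approximated by MCMC, which is where unbiased MCMC comes in.

# BayesBag sketch: average posterior expectations over bootstrap resamples of the data.
# Placeholder model: y_i ~ N(theta, 1) with prior theta ~ N(0, 100), so the posterior mean is closed form.
set.seed(2)
y <- rnorm(1000, mean = 1, sd = 1)         # illustrative data
posterior_mean <- function(data, prior_var = 100) {
  n <- length(data)
  (prior_var * sum(data)) / (n * prior_var + 1)  # conjugate posterior mean of theta
}
B <- 200                                    # number of bootstrap resamples ("bags")
bagged <- replicate(B, {
  resample <- sample(y, replace = TRUE)     # bootstrap the observations
  posterior_mean(resample)                  # exact here; MCMC in non-conjugate models
})
mean(bagged)                                # BayesBag estimate of E[theta | data]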

Coding algorithms in R for models written in Stan

Hi all, On top of recommending the excellent autobiography of Stanislaw Ulam, this post is about using the software Stan, not directly to perform inference, but to obtain R functions that evaluate a target’s probability density function and its gradient. With these, one can implement custom methods while still benefiting from the great work…
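As an illustration of this kind of workflow (a sketch assuming the rstan interface; the toy model, data and evaluation point are placeholders, not taken from the post), one can compile a Stan model, run a token fit, and then query the log density and its gradient on the unconstrained space:

library(rstan)
# Placeholder Stan model: Normal observations with unknown mean and scale.
model_code <- "
data { int<lower=1> N; vector[N] y; }
parameters { real mu; real<lower=0> sigma; }
model { mu ~ normal(0, 10); sigma ~ cauchy(0, 5); y ~ normal(mu, sigma); }
"
sm  <- stan_model(model_code = model_code)
dat <- list(N = 50, y = rnorm(50, 1, 2))
# A very short run, just to obtain a stanfit object exposing the model's log density.
fit <- sampling(sm, data = dat, chains = 1, iter = 10, refresh = 0)

# Evaluate the log target and its gradient at a point on the unconstrained space.
upars <- unconstrain_pars(fit, list(mu = 0.5, sigma = 1.5))
log_prob(fit, upars)        # log density (up to an additive constant)
grad_log_prob(fit, upars)   # gradient, usable inside custom samplers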

Estimating convergence of Markov chains

Hi all, Niloy Biswas (a PhD student at Harvard) and I have recently arXived a manuscript on the assessment of MCMC convergence (using couplings!). Here I’ll describe the main result, and some experiments (not in the current version of the paper) revisiting a 1996 paper by Mary Kathryn Cowles and Jeff Rosenthal entitled “A simulation…
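Coupling-based diagnostics of this kind rely on running pairs of chains that can meet exactly. As a self-contained ingredient (a generic sketch, not the specific construction of the manuscript), here is R code sampling from a maximal coupling of two univariate Normal distributions, which makes the two draws equal with the highest possible probability:

# Maximal coupling of p = N(mu1, s1) and q = N(mu2, s2): returns (X, Y) with
# X ~ p, Y ~ q, and P(X = Y) as large as possible.
maximal_coupling <- function(mu1, s1, mu2, s2) {
  x <- rnorm(1, mu1, s1)
  if (dnorm(x, mu1, s1, log = TRUE) + log(runif(1)) <= dnorm(x, mu2, s2, log = TRUE)) {
    return(c(x, x))                      # the two draws coincide
  }
  repeat {                               # otherwise, sample Y from the residual part of q
    y <- rnorm(1, mu2, s2)
    if (dnorm(y, mu2, s2, log = TRUE) + log(runif(1)) > dnorm(y, mu1, s1, log = TRUE)) {
      return(c(x, y))
    }
  }
}
maximal_coupling(0, 1, 0.5, 1)           # e.g. coupled proposals for a pair of MH chains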

particles

A few months ago I released a Python package for particle filtering, called particles; you can find it on GitHub here. You may want to have a look first at the documentation, in particular the tutorials here. This package has been developed to support our forthcoming book with Omiros Papaspiliopoulos, tentatively called: an introduction…

Budget constrained simulations

Hi all, This post is about some results from “Bias Properties of Budget Constrained Simulations” by Glynn & Heidelberger, published in Operations Research in 1990. I have found these results extremely useful, and our latest manuscript on unbiased MCMC recalls them in detail. Below I go through some of the results and describe the simulations…
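To give a flavour of the setting (a toy R sketch of my own, not one of Glynn & Heidelberger’s actual examples): suppose each replication of a simulation takes a random time and we average only the replications completed within a fixed budget. The number of completed replications is then random and correlated with their outputs, which is what creates the bias studied in the paper; it vanishes as the budget grows.

# Toy illustration of budget-constrained averaging (placeholder distributions).
set.seed(3)
budget_estimate <- function(budget) {
  times  <- rexp(1000, rate = 1)           # completion time of each replication
  values <- times + rnorm(1000, sd = 0.1)  # output correlated with completion time, mean 1
  done   <- cumsum(times) <= budget        # replications completed within the budget
  if (!any(done)) return(NA)
  mean(values[done])                       # estimator based on completed replications only
}
estimates <- replicate(5000, budget_estimate(budget = 5))
mean(estimates, na.rm = TRUE) - 1          # empirical bias relative to E[value] = 1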

Another take on the Hyvärinen score for model comparison

[Figure: Exact log-Bayes factors (log-BF) and H-factors (HF) of M1 against M2, computed for 100 independent samples (thin solid lines) of 1000 observations generated as i.i.d. N(1,1), under three increasingly vague priors for θ1.]

In a former post, Pierre wrote about Bayesian model comparison and the limitations of Bayes factors in the presence of vague priors. …
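For reference, the Hyvärinen score of a density p at an observation y is commonly written as follows (stated from the standard definition, up to the convention on the factor 2, and not quoted from the post):

\[
\mathcal{H}(y, p) \;=\; 2\,\Delta_y \log p(y) \;+\; \lVert \nabla_y \log p(y) \rVert^2 .
\]

Since it depends on p only through derivatives of log p(y), it is unaffected by the normalizing constant of p; H-factors accumulate prequential Hyvärinen scores of predictive densities in the same way that log-Bayes factors accumulate log predictive densities.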

Final update on unbiased smoothing

Hi, Two years ago I blogged about couplings of conditional particle filters for smoothing. The paper with Fredrik Lindsten and Thomas Schön has just been accepted for publication in JASA, and the arXiv version and GitHub repository are hopefully in their final forms. Here I’ll mention a few recent developments and follow-up articles by other researchers.