Statisfaction

Sequential Bayesian inference for time series

Posted in Statistics by Pierre Jacob on 19 May 2015
Bayes factor between two hidden Markov models, against the number of assimilated observations. Values near zero support the simpler model while values larger than one support the more complex model.

Hello hello,

I have just arXived a review article, written for ESAIM: Proceedings and Surveys, called Sequential Bayesian inference for implicit hidden Markov models and current limitations. The topic is sequential Bayesian estimation: performing inference with hidden Markov models (say, parameter inference, or prediction of future observations) while taking parameter and model uncertainties into account. I hope the article can be useful to some people: I have tried to stay at a general level, but there are more than 90 references if you’re interested in learning more (sorry in advance for not having cited your article on the topic!). Below I’ll comment on a few points.
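To fix ideas, here is a minimal sketch (not taken from the article) of how the log-Bayes factor between two state-space models can be tracked sequentially, by accumulating the incremental log-likelihood estimates of a bootstrap particle filter for each model. The names particle_filter_loglik, simulate_transition and log_obs_density are hypothetical, and the sketch assumes fixed parameters, whereas handling parameter uncertainty is precisely where the difficulties discussed in the article arise.

particle_filter_loglik <- function(y, n_particles, simulate_transition, log_obs_density) {
  # bootstrap particle filter returning the cumulative log-likelihood estimate
  # after each observation (standard normal initial distribution assumed)
  x <- rnorm(n_particles)
  log_incr <- numeric(length(y))
  for (t in seq_along(y)) {
    x <- simulate_transition(x)              # propagate particles through the hidden dynamics
    logw <- log_obs_density(y[t], x)         # weight each particle by the observation density
    m <- max(logw)
    log_incr[t] <- m + log(mean(exp(logw - m)))   # incremental log-likelihood estimate
    w <- exp(logw - m)
    x <- x[sample.int(n_particles, replace = TRUE, prob = w / sum(w))]  # multinomial resampling
  }
  cumsum(log_incr)
}
# log-Bayes factor of model 1 against model 2, after each assimilated observation:
# log_bf <- particle_filter_loglik(y, 1024, transition1, obsdensity1) -
#           particle_filter_loglik(y, 1024, transition2, obsdensity2)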


[Meta-]Blogging as young researchers

Posted in General, Statistics by Pierre Jacob on 11 December 2014

Summary of the article.

Hello all,

This is an article intended for the ISBA bulletin, jointly written by us all at Statisfaction, Rasmus Bååth from Publishable Stuff, Boris Hejblum from Research side effects, Thiago G. Martins from tgmstat@wordpress, Ewan Cameron from Another Astrostatistics Blog and Gregory Gandenberger from gandenberger.org.

Inspired by established blogs, such as the popular Statistical Modeling, Causal Inference, and Social Science or Xi’an’s Og, each of us began blogging as a way to diarize our learning adventures, to share bits of R code or LaTeX tips, and to advertise our own papers and projects. Along the way we’ve come to a new appreciation of the world of academic blogging: a never-ending international seminar, attended by renowned scientists and anonymous users alike. Here we share our experiences by weighing the pros and cons of blogging from the point of view of young researchers.


Non-negative unbiased estimators

Posted in Statistics by Pierre Jacob on 13 May 2014
Benedict Cumberbatch having to choose between unbiasedness and non-negativity.

Hey hey,

With Alexandre Thiéry, we’ve been working on non-negative unbiased estimators for a while now. Since I’ve been talking about it at conferences and we’ve just arXived the second version of the article, it’s time for a blog post. This post is a follow-up to a previous post from July, where I commented on Playing Russian Roulette with Intractable Likelihoods by Mark Girolami, Anne-Marie Lyne, Heiko Strathmann, Daniel Simpson and Yves Atchade.
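To see why the two requirements clash, here is a toy illustration, unrelated to the construction studied in the article: an unbiased estimator of exp(-lambda), obtained by randomly truncating its power series Russian-roulette style, is unbiased by construction yet frequently negative.

rr_estimator <- function(lambda, p = 0.5) {
  # unbiased estimator of exp(-lambda): truncate the series sum_k (-lambda)^k / k!
  # at a geometric time N and reweight each kept term by 1 / P(N >= k) = (1 - p)^(-k)
  n <- rgeom(1, p)
  k <- 0:n
  sum(((-lambda)^k / factorial(k)) / (1 - p)^k)
}
set.seed(1)
draws <- replicate(1e5, rr_estimator(lambda = 1))
mean(draws)      # close to exp(-1), approximately 0.37
mean(draws < 0)  # yet roughly a third of the draws are negative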


Parallel resampling in the particle filter

Posted in Statistics by Pierre Jacob on 12 May 2014
Decisions decisions… which resampling to use on the CPU (left), GPU (middle), or between both (right). The y-axis essentially parametrizes the expected variance of the weights (high values = “we observe an outlier” = high variance of the weights).

Hey there,

It’s been a while since I last wrote about parallelization and GPUs. With colleagues Lawrence Murray and Anthony Lee, we have just arXived a new version of Parallel resampling in the particle filter. The setting is that, on modern computing architectures such as GPUs, thousands of operations can be performed in parallel (i.e. simultaneously), so the remaining calculations that cannot be parallelized quickly become the bottleneck. In the case of the particle filter (or any sequential Monte Carlo method, such as SMC samplers), that bottleneck is the resampling step. The article investigates this issue and numerically compares different resampling schemes.
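For reference, here are minimal single-threaded sketches (not the parallel implementations studied in the article) of two standard resampling schemes that such comparisons involve; both take normalized weights and return ancestor indices.

multinomial_resampling <- function(weights) {
  # i.i.d. draws from the weight distribution
  n <- length(weights)
  sample.int(n, size = n, replace = TRUE, prob = weights)
}
systematic_resampling <- function(weights) {
  # one uniform shifted to n stratified positions, inverted through the
  # cumulative weights (a collective "scan", part of what makes parallelization delicate)
  n <- length(weights)
  u <- (runif(1) + 0:(n - 1)) / n
  pmin(findInterval(u, cumsum(weights)) + 1, n)
}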


Moustache target distribution and Wes Anderson

Posted in Art, Geek, R by Pierre Jacob on 31 March 2014

Today I am going to introduce the moustache target distribution (moustarget distribution for brevity). Load some packages first.

library(wesanderson) # on CRAN
library(RShapeTarget) # available on https://github.com/pierrejacob/RShapeTarget/
library(PAWL) # on CRAN

Let’s invoke the moustarget distribution.

# build a target distribution from the segments of an SVG file
shape <- create_target_from_shape(
  file_name = system.file(package = "RShapeTarget", "extdata/moustache.svg"),
  lambda = 5)
# initial distribution of the chains: standard bivariate normal
rinit <- function(size) matrix(rnorm(2 * size), ncol = 2)
# wrap everything into a target object for PAWL
moustarget <- target(name = "moustache", dimension = 2,
                     rinit = rinit, logdensity = shape$logd,
                     parameters = shape$algo_parameters)

This defines a target distribution represented by an SVG file, using RShapeTarget. The target probability density function is defined on \mathbb{R}^2; it is proportional to 1 on the segments described in the SVG file, and decreases exponentially fast to 0 away from the segments. The density function of the moustarget is plotted below, a picture being worth a thousand words.
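For instance, the density can be evaluated and displayed on a grid as follows. This is a rough sketch: it assumes that shape$logd accepts a matrix of points together with the parameter list (consistently with the target() call above), and the plotting window is a guess to be adjusted to the shape.

xs <- seq(-5, 5, length.out = 300)
ys <- seq(-3, 3, length.out = 300)
grid <- as.matrix(expand.grid(xs, ys))
logd <- matrix(shape$logd(grid, shape$algo_parameters), nrow = length(xs))
image(xs, ys, exp(logd), xlab = "x", ylab = "y",
      col = wes_palette("Zissou1", 100, type = "continuous"))  # "Zissou" in older wesanderson versions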


Beautiful Science: Picturing Data, Inspiring Insight

Posted in General by Pierre Jacob on 19 March 2014

Martin Krzywinski, Circles of Life, 2013 (© Martin Krzywinski)

Hey,

There’s a nice exhibition open until May 26th at the British Library in London, entitled Beautiful Science: Picturing Data, Inspiring Insight. Various examples of data visualizations are shown, either historical or very modern, or even made especially for the exhibition. Definitely worth a detour if you happen to be in the area; you can see everything in 15 minutes.

In particular there are nice visualisations of historical climate data, gathered from the logbooks of the English East India Company, whose ships were crossing every possible sea at the beginning of the 19th century. The logbooks contain locations and daily weather reports, handwritten by the captains themselves. It turns out the logbooks are kept at the British Library itself, and some of them are on display at the exhibition. More info on that project here: oldweather.org.

Rasmus Bååth’s Bayesian first aid

Posted in Project, R, Statistics by Pierre Jacob on 23 January 2014

Besides having coded a pretty cool MCMC app in Javascript, this guy Rasmus Bååth has started the Bayesian first aid project. The idea is that if there’s an R function called blabla.test performing test “blabla”, there should be a function bayes.blabla.test performing a similar test in a Bayesian framework, and showing the output in a similar way so that the user can easily compare both approaches. This post explains it all. JAGS and BEST seem to be the two main workhorses under the hood.
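For example (a sketch, assuming the development version of the package from github.com/rasmusab/bayesian_first_aid and a working JAGS installation):

# devtools::install_github("rasmusab/bayesian_first_aid")
library(BayesianFirstAid)
binom.test(x = 7, n = 20)        # classical test of a proportion
bayes.binom.test(x = 7, n = 20)  # its Bayesian counterpart, printed in a similar format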

Kudos to Rasmus for this very practical approach, potentially very impactful. Maybe someday people will have to specify if they want a frequentist approach and not the other way around! (I had a dream, etc).

MCM’Ski lessons

Posted in Seminar/Conference by Pierre Jacob on 16 January 2014

A few days after the MCMSki conference, I am starting to see the main lessons I gathered there.

  1. I should really read the full program before attending the next MCMSki. The three parallel sessions looked consistently interesting, and I really regret having missed some talks (in particular Dawn Woodard‘s and Natesh Pillai‘s) and some posters as well (admittedly, due to exhaustion on my part).
  2. Compared to the previous instance three years ago (in Utah), the main themes have significantly changed. Scalability, approximate methods, non-asymptotic results, 1/n methods … these keywords are now on everyone’s lips. Can’t wait to see if MCQMC’14 will feel that different from MCQMC’12.
  3. The community is rightfully concerned about scaling Monte Carlo methods to big data, with some people pointing out that models should also be rethought in this new context.
  4. The place of software developers in the conference, or simply references to software packages in the talks, is much greater than it used to be. It’s a very good sign towards reproducible research in our field. There’s still a lot of work to do, in particular in terms of making parallel computing easier to access (time to advertise LibBi a little bit). On a related note, many people now point out whether their proposed algorithms are parallel-friendly or not.
  5. Going from the Rockies to the Alps, the food drastically changed from cheeseburgers to just melted cheese. Bread could be found but ground beef and Budweiser were reported missing.
  6. It’s fun to have an international conference in your home country, but switching from French to English all the time was confusing.

Back in flooded Oxford now!

Dennis Lindley (1923-2013)

Posted in General by Pierre Jacob on 16 December 2013

Pierre Jacob:

I’ve just heard this sad piece of news. Definitely one of the greatest statisticians of the last 50 years. I wish I had met him in person.

Originally posted on Xi'an's Og:

Dennis Lindley most sadly passed away yesterday at the hospital near his home in Somerset. He was one of the founding fathers of our field (of Bayesian statistics), who contributed to formalise Bayesian statistics in a coherent theory. And to make it one with rational decision-making, a perspective missing in Jeffreys’ vision. (His papers figured prominently in the tutorials we gave yesterday for the opening of O’Bayes 250.) At the age of 90, his interest in the topic had not waned away: as his interview with Tony O’Hagan last Spring showed, his passionate arguing for the rationale of the Bayesian approach was still there and alive! The review he wrote of The Black Swan a few years ago also demonstrated he had preserved his ability to see through bogus arguments. (See his scathing “One hardly advances the respect with which statisticians are held in society by making…


Singapore → Oxford

Posted in General by Pierre Jacob on 3 October 2013


A quick post to say that I’m moving from Singapore to Oxford, UK. I will dearly miss my Singaporean colleagues, as well as my morning Laksa and Nasi lemak. I look forward to the skiing season though.

I will work for the next two years as a post-doc with Professors Arnaud Doucet and Yee Whye Teh, on sequential Monte Carlo methods for high-dimensional problems.

Former office mate Alex Thiery is still in Singapore and will start blogging here soon, so we’ll still have two continents covered. We’re still looking for contributors on the other ones!
