Statisfaction

New R user community in Grenoble, France

Posted in R, Seminar/Conference by Julyan Arbel on 13 September 2017


Nine R user communities already exist in France, and there is a much larger number of R communities around the world. It was time for Grenoble to start its own!

The goal of the R user group is to facilitate the identification of local useRs, to initiate contacts, and to organise experience and knowledge sharing sessions. The group is open to any local useR interested in learning and sharing knowledge about R.

The group’s website features a map and a table of the group’s members. Members with specific skills related to the use of R are referenced in the table and can be contacted by other members. A Gitter chat room allows members to discuss R issues, and a calendar lists the upcoming events.


ABC in Banff

Posted in General, Seminar/Conference, Statistics by Pierre Jacob on 6 March 2017

Banff, also known as not the worst location for a scientific meeting.

Hi all,

Last week I attended a wonderful meeting on Approximate Bayesian Computation in Banff, which gathered a nice crowd of ABC users and enthusiasts, including lots of people outside of computational stats, whom I wouldn’t have met otherwise. Christian blogged about it there. My talk on Inference with Wasserstein distances is available as a video here (joint work with Espen Bernton, Mathieu Gerber and Christian Robert, the paper is here). In this post, I’ll summarize a few (personal) points and questions on ABC methods, after recalling the basics of ABC (ahem).
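To fix ideas, here is a minimal rejection-ABC sketch in R. It is a toy illustration of the general idea only, not the algorithm or the models from the paper: a normal location model with a uniform prior, where the usual summary-statistic discrepancy is replaced by the 1-d Wasserstein distance between observed and simulated samples (which, for univariate samples of equal size, is just the mean absolute difference between the sorted samples).

# Toy illustration only (not the algorithm or models from the paper): rejection
# ABC for a normal location parameter, with the Wasserstein distance between
# observed and simulated samples playing the role of the discrepancy.
set.seed(1)
n <- 100
y_obs <- rnorm(n, mean = 2, sd = 1)       # observed data, sd assumed known

# Wasserstein-1 distance between two univariate samples of the same size:
# the mean absolute difference between the sorted samples.
wasserstein1 <- function(x, y) mean(abs(sort(x) - sort(y)))

n_prop   <- 1e5                           # number of prior draws
mu_prior <- runif(n_prop, -10, 10)        # uniform prior on the location mu
dists    <- sapply(mu_prior, function(mu)
  wasserstein1(y_obs, rnorm(n, mean = mu, sd = 1)))

# Keep the draws whose simulated data land closest to the observations
eps    <- quantile(dists, 0.001)          # threshold giving a 0.1% acceptance rate
mu_abc <- mu_prior[dists <= eps]
summary(mu_abc)                           # crude ABC posterior sample for mu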


momentify R package at BAYSM14

Posted in General, R, Seminar/Conference, Statistics by Julyan Arbel on 20 September 2014

I presented an arXived paper from my postdoc at the very successful Young Bayesian meeting (BAYSM 2014) in Vienna. The big picture of the talk is simple: there are situations in Bayesian nonparametrics where you cannot sample from the posterior distribution, but you can still compute posterior expectations (so-called marginal methods). So, for instance, you cannot provide credible intervals. But sometimes all the moments of the posterior distribution are available as posterior expectations. So morally, you should be able to say more about the posterior distribution than just reporting the posterior mean. To be more specific, we consider a hazard (h) mixture model

\displaystyle h(t)=\int k(t;y)\mu(dy)

where k is a kernel, and the mixing distribution \mu is random and discrete (Bayesian nonparametric approach).

We consider the survival function S which is recovered from the hazard rate h by the transform

\displaystyle S(t)=\exp\Big(-\int_0^t h(s)ds\Big)

and some possibly censored survival data with survival function S. Then it turns out that all the posterior moments of the survival curve S(t), evaluated at any time t, can be computed.

The nice trick of the paper is to use the representation of a distribution in a [Jacobi polynomial] basis where the coefficients are linear combinations of the moments. So one can sample from [an approximation of] the posterior, and with a posterior sample we can do everything! Including credible intervals.

I’ve wrapped up the few lines of code in an R package called momentify (not on CRAN). With a sequence of moments of a random variable supported on [0,1] as an input, the package does two things:

  • evaluates the approximate density
  • samples from it

A package example for a mixture of betas, using 2 to 7 moments, gives the following result:

[Figure: approximate densities of the beta mixture obtained from 2 to 7 moments]
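The momentify package itself is not on CRAN and its actual interface is not reproduced here. Below is instead a small stand-alone sketch of the underlying idea, under the simplifying assumption of an orthonormal shifted Legendre basis on [0,1] (the alpha = beta = 0 member of the Jacobi family): the basis coefficients are linear combinations of the moments, the density is approximated by a truncated basis expansion, and sampling is done by inverse CDF on a grid.

# Stand-alone sketch (not the momentify package): approximate a density on [0, 1]
# from its first few moments with an orthonormal shifted Legendre basis, then
# sample from the approximation by inverse CDF on a grid.

# Monomial coefficients of the orthonormal shifted Legendre polynomial of degree k:
# e_k(x) = sqrt(2k + 1) * sum_j (-1)^(k + j) * choose(k, j) * choose(k + j, j) * x^j
legendre_coef <- function(k) {
  j <- 0:k
  sqrt(2 * k + 1) * (-1)^(k + j) * choose(k, j) * choose(k + j, j)
}

# Density approximation f(x) ~ sum_k c_k e_k(x), where c_k = E[e_k(X)] is a
# linear combination of the moments E[X^0], ..., E[X^N].
approx_density <- function(moments) {
  mom <- c(1, moments)                      # prepend E[X^0] = 1
  N <- length(moments)
  ck <- sapply(0:N, function(k) sum(legendre_coef(k) * mom[1:(k + 1)]))
  function(x) sapply(x, function(xx)
    sum(sapply(0:N, function(k) ck[k + 1] * sum(legendre_coef(k) * xx^(0:k)))))
}

# Example: first 7 moments of an equal mixture of Beta(2, 8) and Beta(8, 2)
beta_moment <- function(p, a, b) prod((a + 0:(p - 1)) / (a + b + 0:(p - 1)))
mom7 <- sapply(1:7, function(p) 0.5 * beta_moment(p, 2, 8) + 0.5 * beta_moment(p, 8, 2))
f_hat <- approx_density(mom7)

# Sample by inverse CDF on a grid (negative values of the expansion truncated to 0)
grid <- seq(0, 1, length.out = 512)
dens <- pmax(f_hat(grid), 0)
cdf  <- cumsum(dens) / sum(dens)
draws <- grid[pmin(findInterval(runif(1000), cdf) + 1, length(grid))]
hist(draws, breaks = 30, freq = FALSE, main = "Moment-based approximation")
curve(0.5 * dbeta(x, 2, 8) + 0.5 * dbeta(x, 8, 2), add = TRUE)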

MCM’Ski lessons

Posted in Seminar/Conference by Pierre Jacob on 16 January 2014

A few days after the MCMSki conference, I am starting to see the main lessons I gathered there.

  1. I should really read the full program before attending the next MCMSki. The three parallel sessions looked consistently interesting, and I really regret having missed some talks (in particular Dawn Woodard‘s and Natesh Pillai‘s) and some posters as well (admittedly, due to exhaustion on my part).
  2. Compared to the previous instance three years ago (in Utah), the main themes have significantly changed. Scalability, approximate methods, non-asymptotic results, 1/n methods … these keywords are now on everyone’s lips. Can’t wait to see if MCQMC’14 will feel that different from MCQMC’12.
  3. The community is rightfully concerned about scaling Monte Carlo methods to big data, with some people pointing out that models should also be rethought in this new context.
  4. The place given to software developers in the conference, and to references to software packages in the talks, is much greater than it used to be. It’s a very good sign for reproducible research in our field. There’s still a lot of work to do, in particular in terms of making parallel computing easier to access (time to advertise LibBi a little bit). On a related note, many people now point out whether their proposed algorithms are parallel-friendly or not.
  5. Going from the Rockies to the Alps, the food drastically changed from cheeseburgers to just melted cheese. Bread could be found but ground beef and Budweiser were reported missing.
  6. It’s fun to have an international conference in your home country, but switching from French to English all the time was confusing.

Back in flooded Oxford now!

Joint Statistical Meeting 2013

Posted in General, Seminar/Conference, Statistics by Pierre Jacob on 23 July 2013
A typical statistical meeting.

Hey,

In a few weeks (August 3-8) I’ll attend the Joint Statistical Meeting in Montréal, Canada. According to Wikipedia it’s been held every year since 1840 and now gathers more than 5,000 participants!

I’ll give a talk in a session organized by Scott Schmidler, entitled Adaptive Monte Carlo Methods for Bayesian Computation; you can find the session programme here [online program]. I’ll talk about score and observed Fisher information matrix estimation in state-space models.

According to the rumour and Christian’s reflections on the past years (2009, 2010, 2011), I should prepare my schedule in advance to really enjoy this giant meeting. So if you want to meet there, please send me an e-mail!

See you in Montréal!

Back from ISBA Regional Meeting in India

Posted in Seminar/Conference, Statistics by Pierre Jacob on 15 January 2013

Hello everyone,

and of course Happy New Year (2013 is the international year of statistics!).

Last week the ISBA Regional Meeting was held in Banaras / Varanasi, in the north of India. The conference was well attended, with leading figures such as Jayanta K. Ghosh, José Bernardo, James Berger, Peter Green and Christian Robert (who blogged about it), and around 350 participants overall.


A glimpse of Inverse Problems

Posted in General, Seminar/Conference, Statistics by JB Salomond on 15 November 2012


Hi folks!

Last Tuesday a seminar on Bayesian procedures for inverse problems took place at CREST. We had time for two presentations by young researchers, Bartek Knapik and Kolyan Ray. Both presentations dealt with the problem of observing a noisy version of a linear transform of the parameter of interest

\displaystyle Y = K\mu + \frac{1}{\sqrt{n}} Z

where K is a linear operator and Z is a Gaussian white noise. Both presentations considered asymptotic properties of the posterior distribution (their papers can be found on arXiv, here for Bartek’s and here for Kolyan’s). There is a wide literature on asymptotic properties of the posterior distribution in direct models. When looking at the concentration of f toward a true distribution f_0 given the data, with respect to some distance d(.,.), a well-known problem is to derive concentration rates, that is, the rate \epsilon_n such that

\pi(d(f,f_0) > \epsilon_n | X^n) \to 0.

For inverse problems, the usual methods, as introduced by Ghosal, Ghosh and van der Vaart (2000), usually fail, and thus results in this setting are in general difficult to obtain.

Bartek presented some very refined results in the conjugate case. He obtained results on the concentration rates of the posterior distribution, on Bayesian credible sets, and on Bernstein–von Mises theorems (which state that the posterior is asymptotically Gaussian) when estimating a linear functional of the parameter of interest. Kolyan gave some general conditions on the prior under which concentration rates can be derived, and proved that these techniques lead to optimal concentration rates for classical models.
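For readers who, like me, are new to the area, here is a toy simulation of the conjugate setting. It is entirely my own illustration, not taken from either paper: writing the white noise model in the singular basis of K turns it into a Gaussian sequence model Y_i = kappa_i mu_i + Z_i / sqrt(n), and with independent Gaussian priors on the coefficients the posterior is Gaussian coefficient by coefficient.

# Toy conjugate example (my own illustration, not from either paper): the white
# noise model in sequence space, Y_i = kappa_i * mu_i + Z_i / sqrt(n), with a
# mildly ill-posed operator kappa_i = i^(-p) and priors mu_i ~ N(0, i^(-1 - 2 * alpha)).
set.seed(2)
n <- 1e4; N <- 200; p <- 1; alpha <- 1
i      <- 1:N
kappa  <- i^(-p)                         # singular values of the operator K
mu0    <- i^(-1.5) * cos(i)              # some "true" coefficients to recover
Y      <- kappa * mu0 + rnorm(N) / sqrt(n)

lambda    <- i^(-1 - 2 * alpha)          # prior variances
post_var  <- 1 / (1 / lambda + n * kappa^2)
post_mean <- post_var * n * kappa * Y    # conjugate Gaussian posterior, coefficient-wise

# Posterior draws of the coefficient sequence and a naive squared-error check
draws <- replicate(1000, rnorm(N, post_mean, sqrt(post_var)))
mean(colSums((draws - mu0)^2))           # average squared L2 distance to the truth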

I knew only a little about inverse problems, but both talks were very accessible and I will surely get more involved in this field!

Recent Advances in Sequential Monte Carlo / Warwick 2012

Posted in General, Seminar/Conference by Pierre Jacob on 21 September 2012

Hello,

This blog is not dead! And it’s gonna get more active soon.

These last few days, a workshop on Sequential Monte Carlo methods was held at the University of Warwick (link to the webpage). It was a very exciting meeting, efficiently organised by Arnaud Doucet, Adam Johansen, Anthony Lee and Murray Pollock, and hosted by CRiSM. For those who couldn’t attend, here’s a little summary of my experience (or more exactly, just a bunch of links). Since SMC methods are at the core of my research, I was naturally interested in all the talks (which is exceptional for a three-day workshop filled with 30 talks!). It was probably a good time for a workshop on SMC, since there’s a lot of recent activity in the field. My impression is that this renewed interest is mainly due to:

The last point was illustrated at this workshop by recent work from Alexandre Bouchard-Côté and colleagues called “Entangled Monte Carlo”, as well as by my own presentation: I talked about a new resampling scheme that avoids global interactions between all the particles and resorts only to multiple pair-wise interactions. This is ongoing work with Pierre Del Moral, Anthony Lee, Lawrence Murray and Gareth Peters, which I might talk about again in more detail in the future!
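For context, here is what the standard, fully global step looks like: a minimal multinomial resampling function in R. Normalizing the weights couples every particle with every other one, which is exactly the kind of global interaction that parallel-friendly schemes try to avoid; this snippet is generic illustration code, not the pair-wise scheme from the talk.

# Generic illustration (not the pair-wise scheme from the talk): standard
# multinomial resampling in an SMC step. Normalizing the weights couples all
# particles together, which is the global interaction that parallel-friendly
# resampling schemes try to avoid or localize.
multinomial_resample <- function(particles, logweights) {
  w <- exp(logweights - max(logweights))     # stabilize before normalizing
  w <- w / sum(w)                            # global normalization across particles
  idx <- sample.int(length(particles), size = length(particles),
                    replace = TRUE, prob = w)
  particles[idx]                             # equally weighted resampled particles
}

# Tiny usage example with made-up particles and log-weights
x  <- rnorm(8)
lw <- dnorm(x, mean = 1, log = TRUE)
multinomial_resample(x, lw)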

Cheers!

Priors on probability measures

Posted in Seminar/Conference, Statistics by Julyan Arbel on 24 April 2012

Hi,

for the next GTB meeting at CREST, on 3rd May, I will present Peter Orbanz’s work on Projective limit random probabilities on Polish spaces. It will follow my previous Bayesian nonparametrics presentation on the Dirichlet process.

The article provides a means of constructing an arbitrary prior distribution on the set of probability measures by working on its finite-dimensional marginals. The vanilla example is the Dirichlet process, which is characterized by its Dirichlet distribution marginals on any finite partition of the space (other examples are the Normalized Inverse Gaussian Process and the Pólya Tree). The figure above illustrates the projective property of the marginals.
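To make the vanilla example concrete, here is a small toy check in R, my own illustration rather than anything from the paper: Dirichlet processes are simulated by truncated stick-breaking, and the average masses they assign to a fixed finite partition are compared with the means of the corresponding Dirichlet distribution.

# Toy check of the defining property of the Dirichlet process (my own
# illustration, not the construction of the paper): for P ~ DP(alpha, H) with
# base measure H = U[0, 1], the vector (P(A_1), ..., P(A_k)) over a finite
# partition is Dirichlet(alpha * H(A_1), ..., alpha * H(A_k)) distributed.
set.seed(3)
alpha  <- 2
breaks <- c(0, 0.2, 0.5, 1)                    # partition of [0, 1] into 3 cells

dp_partition_masses <- function(alpha, breaks, trunc = 500) {
  v <- rbeta(trunc, 1, alpha)                  # stick-breaking proportions
  w <- v * cumprod(c(1, 1 - v[-trunc]))        # atom weights
  w <- w / sum(w)                              # renormalize after truncation
  atoms <- runif(trunc)                        # atom locations drawn from H
  cell  <- cut(atoms, breaks, labels = FALSE)  # which cell each atom falls in
  sapply(seq_len(length(breaks) - 1), function(j) sum(w[cell == j]))
}

masses <- t(replicate(2000, dp_partition_masses(alpha, breaks)))
rbind(empirical   = colMeans(masses),          # Monte Carlo means of the partition masses
      theoretical = diff(breaks))              # Dirichlet means H(A_j)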

Peter will speak at the ISBA 2012 meeting in Kyoto, in the session On the uses of random probabilities in Bayesian inference, along with Ramses Mena and Antonio Lijoi. I’ll write more about that later on!

Awesome Bristol

Posted in Seminar/Conference by Pierre Jacob on 24 April 2012

Hey,

Last week there was a workshop on Confronting Intractability in Statistical Inference, organised by the University of Bristol and the SuSTain group. It was hosted at Goldney Hall (picture above). It turned out to be a succession of fascinating talks about recent developments and the future of statistical methods for very challenging inference problems. What I appreciated above all was the ambition of many talks, and the generosity of the speakers in giving many ideas to the audience.

Among the things I’ve learned there, the following were the most ambitious in my opinion.

