Bayesian workshop in Grenoble, September 6-7

Posted in General, Seminar/Conference by Julyan Arbel on 23 May 2018


We are organising a two-day Bayesian workshop in Grenoble on September 6-7, 2018. It will be the second edition of the Italian-French statistics seminar (link to first edition), titled this year: Bayesian learning theory for complex data modeling. The workshop will give young statisticians the opportunity to learn from and interact with highly qualified senior researchers in probability and theoretical and applied statistics, with a particular focus on Bayesian methods.

Anyone interested in this field is welcome. There will be two junior sessions and a poster session, with a call for abstracts open until June 30. A particular focus will be given to researchers in the early stages of their career, or currently studying for a PhD, MSc or BSc. The junior sessions are supported by ISBA through travel awards.

There will be a social dinner on September 6, and a hike organised in the mountains on September 8.

Confirmed invited speakers

• Simon Barthelmé, Gipsa-lab, Grenoble, France
• Arnoldo Frigessi, University of Oslo, Norway
• Benjamin Guedj, Inria Lille – Nord Europe, France
• Alessandra Guglielmi, Politecnico di Milano, Italy
• Antonio Lijoi, University Bocconi, Milan, Italy
• Bernardo Nipoti, Trinity College Dublin, Ireland
• Sonia Petrone, University Bocconi, Milan, Italy

Important Dates:

• June 30, 2018: Abstract submission closes
• July 20, 2018: Notification on abstract acceptance
• August 25, 2018: Registration closes

More details and how to register:

We look forward to seeing you in Grenoble.




AI in Grenoble, 2nd to 6th July 2018

Posted in General, Seminar/Conference by Julyan Arbel on 22 March 2018


This is an advertisement for a conference on AI organised at Inria Grenoble by the Thoth team and Naver Labs: this AI summer school comprises lectures and practical sessions conducted by renowned experts in different areas of artificial intelligence.

This event is the revival of a past series of very successful summer schools which took place in Grenoble and Paris. The latest edition of this series was held in 2013. While originally focusing on computer vision, the summer school now targets a broader AI audience, and will also include presentations about machine learning, natural language processing, robotics, and cognitive science.

Note that NAVER LABS is funding a number of students to attend PAISS. Apply before 4th April.


momentify R package at BAYSM14

Posted in General, R, Seminar/Conference, Statistics by Julyan Arbel on 20 September 2014

I presented an arXived paper from my postdoc at the very successful Young Bayesian Conference in Vienna. The big picture of the talk is simple: there are situations in Bayesian nonparametrics where you don't know how to sample from the posterior distribution, but can only compute posterior expectations (so-called marginal methods). So, e.g., you cannot provide credible intervals. But sometimes all the moments of the posterior distribution are available as posterior expectations. So morally, you should be able to say more about the posterior distribution than just reporting the posterior mean. To be more specific, we consider a hazard (h) mixture model

\displaystyle h(t)=\int k(t;y)\mu(dy)

where k is a kernel, and the mixing distribution \mu is random and discrete (Bayesian nonparametric approach).

We consider the survival function S which is recovered from the hazard rate h by the transform

\displaystyle S(t)=\exp\Big(-\int_0^t h(s)ds\Big)

and some possibly censored survival data with survival function S. Then it turns out that all the posterior moments of the survival curve S(t), evaluated at any time t, can be computed.
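As a quick illustration of this transform (not from the paper, just the formula above), a constant hazard h(s) = λ gives S(t) = exp(-λt) exactly; a minimal Python sketch with numerical quadrature:

```python
import numpy as np

lam = 0.5                      # constant hazard rate: h(s) = lambda for all s
h = lambda s: lam + 0.0 * s

def survival(t, n_grid=1001):
    """S(t) = exp(-int_0^t h(s) ds), with the integral done by the trapezoidal rule."""
    s = np.linspace(0.0, t, n_grid)
    hs = h(s)
    integral = (s[1] - s[0]) * (hs[0] / 2 + hs[1:-1].sum() + hs[-1] / 2)
    return np.exp(-integral)

print(survival(2.0))  # exp(-0.5 * 2) = exp(-1) ~ 0.3679
```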

The nice trick of the paper is to use the representation of a distribution in a [Jacobi polynomial] basis, where the coefficients are linear combinations of the moments. So one can sample from [an approximation of] the posterior, and with a posterior sample we can do everything, including credible intervals!

I’ve wrapped up the few lines of code in an R package called momentify (not on CRAN). With a sequence of moments of a random variable supported on [0,1] as an input, the package does two things:

  • evaluates the approximate density
  • samples from it

A package example approximates a mixture of betas from its first 2 to 7 moments.
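This is not the package's code (momentify is in R and the paper works with Jacobi polynomials); here is a minimal Python sketch of the moment-to-density idea, using shifted Legendre polynomials (the Jacobi case α = β = 0), so that each expansion coefficient is a plain linear combination of the raw moments:

```python
import numpy as np
from numpy.polynomial import legendre
from numpy.polynomial.polynomial import Polynomial

def density_from_moments(moments, x):
    """Polynomial approximation of a density on [0, 1] from its raw moments
    m_0, ..., m_N, expanded in shifted Legendre polynomials Ptilde_k(x) = P_k(2x - 1),
    which are orthogonal on [0, 1] with norm 1 / (2k + 1)."""
    m = np.asarray(moments, dtype=float)
    shift = Polynomial([-1.0, 2.0])                   # maps x in [0,1] to 2x-1 in [-1,1]
    f = np.zeros_like(np.asarray(x, dtype=float))
    for k in range(m.size):
        mono = legendre.leg2poly([0.0] * k + [1.0])   # P_k in the monomial basis
        ptilde = Polynomial(mono)(shift)              # P_k(2x - 1), by composition
        c_k = ptilde.coef @ m[: ptilde.coef.size]     # E[Ptilde_k(X)] from the moments
        f += (2 * k + 1) * c_k * ptilde(x)
    return f

# Uniform moments m_j = 1/(j+1): the approximation recovers the flat density.
x = np.linspace(0.0, 1.0, 5)
print(density_from_moments([1 / (j + 1) for j in range(5)], x))
```

With the approximate density in hand, sampling can be done by, e.g., rejection or inverse-CDF, which is what makes credible intervals and the rest available.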


YES Workshop in Eindhoven

Posted in Seminar/Conference by Julyan Arbel on 9 November 2010

Hi there,
this week I am attending a workshop in Eindhoven on Bayesian nonparametrics, YES IV. It is nice how much it matches my research interests, ranging basically from asymptotics (rates of convergence of posterior distributions and of posterior risks) to the use of stochastic processes for defining priors (among others, Dirichlet, beta, and neutral-to-the-right processes). So I find it very stimulating.

I thank Ismaël Castillo, one of the two “local” organisers (along with Bas Kleijn), for offering me the opportunity to present my submitted paper on convergence rates. Echoing Pierre’s last post, it took me a lot of “downs” to write the paper. It felt like I would never make it through the never-ending technicalities it involved. And then preparing the presentation was a lot of pressure. Unexpectedly, the talk yesterday was OK, since I was able to i) understand the questions and ii) answer most of them. Afterwards, it was super nice to get feedback from specialists in the field, like Harry van Zanten, Bas Kleijn and Eduard Belitser.

For people interested in it, here are my slides:

Séminaire Parisien de statistique – October

Posted in Seminar/Conference by Julyan Arbel on 28 October 2010

Last Monday I attended my first Séminaire Parisien de statistique this year. It takes place at the Institut Henri Poincaré, whose famous Director is one of the 2010 Fields Medalists, Cédric Villani.

To quote the Valencia discussants’ favorite adjectives, the three talks were terrific and thought-provoking.

Eric Moulines presented Multiple Try Methods in MCMC algorithms. Instead of a unique proposal at each step, one proposes several values, among which only one is kept. The proposals can be chosen independent, but introducing dependence in the right way speeds up the rate of convergence to the stationary distribution. An interesting feature of this algorithm, especially for Pierre, is that it allows parallel computation (of the multiple proposals), whereas the standard Metropolis-Hastings algorithm is essentially sequential. See as well Pierre, Christian and Murray Smith’s block independent Metropolis-Hastings algorithm for further details.
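A minimal Python sketch of one common variant of the scheme (multiple-try Metropolis with symmetric random-walk proposals, in which case the trial weights reduce to the target density itself; not necessarily the exact algorithm of the talk):

```python
import numpy as np

rng = np.random.default_rng(42)

def mtm_step(x, log_pi, k=5, scale=1.0):
    """One multiple-try Metropolis step with a symmetric Gaussian random-walk
    proposal. The k trial draws and their k target evaluations are mutually
    independent, hence parallelisable, unlike plain Metropolis-Hastings."""
    ys = x + scale * rng.standard_normal(k)        # k trial proposals
    wy = np.exp(log_pi(ys))
    y = ys[rng.choice(k, p=wy / wy.sum())]         # keep one trial, prob ~ weight
    xs = y + scale * rng.standard_normal(k - 1)    # reference points around y...
    wx = np.exp(log_pi(np.append(xs, x)))          # ...plus the current state
    return y if rng.random() < wy.sum() / wx.sum() else x

# Target: standard normal. The chain's moments should come out close to (0, 1).
log_pi = lambda z: -0.5 * z**2
chain, x = np.empty(5000), 0.0
for i in range(chain.size):
    x = mtm_step(x, log_pi)
    chain[i] = x
```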

Jean-Marc Bardet introduced a way to detect change-points (ruptures) in time series. He focuses on causal time series, i.e. series that can be written in terms of present and past innovations only, for example AR(\infty). A change-point at time t means the parameters change at t.

The must-see talk for me was Eric Barat’s presentation on BNP modeling for space-time emission tomography. For newcomers, BNP means more than a bank: Bayesian nonparametrics. It is nice to see a very efficient application of BNP methods to a medical field. Eric kindly shared his slides (see below), which I recommend, especially the section on random probability measures: he reviews properties of the Dirichlet process and its various representations (Chinese restaurant, stick-breaking), and extends them to the Pitman-Yor process and Pitman-Yor mixtures. Then he gives posterior simulations by Gibbs sampling. I am interested in models with dependence over time, and I am thankful to Eric for his pointer to a recent article by Chung and Dunson on the local Dirichlet process, a nifty and simple construction of a dependent Dirichlet process.
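For the curious, here is a minimal Python sketch of the stick-breaking (Sethuraman) representation mentioned above, truncated at a finite number of atoms, which is the standard practical approximation since the weights decay geometrically:

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, n_atoms, base_sampler):
    """Truncated stick-breaking draw from a Dirichlet process DP(alpha, G0):
    v_k ~ Beta(1, alpha), weights w_k = v_k * prod_{j<k} (1 - v_j),
    atoms drawn i.i.d. from the base measure G0."""
    v = rng.beta(1.0, alpha, size=n_atoms)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    return w, base_sampler(n_atoms)

# One draw from DP(2, N(0,1)): a discrete random probability measure whose
# weights sum to (nearly) 1 at this truncation level.
w, atoms = stick_breaking(alpha=2.0, n_atoms=500, base_sampler=rng.standard_normal)
print(w.sum())
```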

In a few days, I will try to make clear what the Dirichlet process is!

NP Bayes slides

Posted in Seminar/Conference by Julyan Arbel on 1 September 2010

I attended in late August a conference on Bayesian nonparametric statistical methods in Santa Cruz, California. The main lecturer was Peter Müller. He covered the following topics:

• Dirichlet processes and the Chinese restaurant process
• Dirichlet process mixture models
• Polya trees
• Dependent Dirichlet processes
• Species sampling models
• Product partition models
• Beta processes and the Indian buffet process
• Computational tools

If you are interested, the slides of his ten talks can be found here, or here for a zip file.
