Are Gibbs-type priors the most natural generalization of the Dirichlet process?

28 Feb 2015  ·  De Blasi P., Favaro S., Lijoi A., Mena R. H., Pruenster I., Ruggiero M.

Discrete random probability measures and the exchangeable random partitions they induce are key tools for addressing a variety of estimation and prediction problems in Bayesian inference. Indeed, many popular nonparametric priors, such as the Dirichlet and the Pitman-Yor process priors, select discrete probability distributions almost surely and therefore automatically induce exchangeable random partitions. Here we focus on the family of Gibbs-type priors, a recent and elegant generalization of the Dirichlet and the Pitman-Yor process priors. These random probability measures share properties that are appealing from both a theoretical and an applied point of view: (i) they admit an intuitive characterization in terms of their predictive structure, which justifies their use as a precise assumption on the learning mechanism; (ii) they stand out in terms of mathematical tractability; (iii) they include several interesting special cases besides the Dirichlet and the Pitman-Yor processes. The goal of our paper is to provide a systematic and unified treatment of Gibbs-type priors and to highlight their implications for Bayesian nonparametric inference. We will deal with their distributional properties, the resulting estimators, frequentist asymptotic validation and the construction of time-dependent versions. Applications, mainly concerning hierarchical mixture models and species sampling, will serve to convey the main ideas. The intuition inherent in this class of priors and the neat results that can be deduced for it lead one to wonder whether it actually represents the most natural generalization of the Dirichlet process.
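For readers unfamiliar with the characterization alluded to in point (i), the following display sketches the formulation of Gibbs-type priors as it is commonly stated in the literature; the notation (the weights V_{n,k}, the discount parameter σ, the base measure P_0) is not introduced in the abstract itself. A Gibbs-type prior with σ < 1 induces an exchangeable random partition whose probability function factorizes as

$$
\Pr(K_n = k,\; N_1 = n_1, \dots, N_k = n_k) \;=\; V_{n,k} \prod_{j=1}^{k} (1-\sigma)_{n_j - 1},
$$

where $(a)_m = a(a+1)\cdots(a+m-1)$ denotes the ascending factorial and the nonnegative weights $V_{n,k}$ satisfy the recursion $V_{n,k} = (n - \sigma k)\, V_{n+1,k} + V_{n+1,k+1}$ with $V_{1,1} = 1$. The induced predictive distribution, given $n$ observations featuring $k$ distinct values $X_1^*, \dots, X_k^*$ with frequencies $n_1, \dots, n_k$, takes the form

$$
\Pr\bigl(X_{n+1} \in \cdot \mid X_1, \dots, X_n\bigr) \;=\; \frac{V_{n+1,k+1}}{V_{n,k}}\, P_0(\cdot) \;+\; \frac{V_{n+1,k}}{V_{n,k}} \sum_{j=1}^{k} (n_j - \sigma)\, \delta_{X_j^*}(\cdot),
$$

with $P_0$ a nonatomic base distribution. The Dirichlet process is recovered for $\sigma = 0$ with $V_{n,k} = \theta^k / (\theta)_n$, and the Pitman-Yor process for $V_{n,k} = \prod_{i=1}^{k-1} (\theta + i\sigma) / (\theta + 1)_{n-1}$.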


Categories

Statistics Theory · Methodology