Abstract
The Dirichlet process was introduced by Ferguson in 1973 for use in Bayesian nonparametric inference problems. A large body of work has since been built on the Dirichlet process, making it the most fundamental prior in Bayesian nonparametric statistics. Since the construction of the Dirichlet process involves an infinite number of random variables, simulation-based methods are hard to implement, and various finite approximations of the Dirichlet process have been proposed to address this problem. In this paper, we construct a new random probability measure called the truncated Poisson–Dirichlet process. It sorts the components of a Dirichlet process in descending order of their random weights and then truncates the sequence to obtain a finite approximation to the distribution of the Dirichlet process. Since the approximation is based on a decreasing sequence of random weights, it has a lower truncation error compared with existing methods based on the stick-breaking process. We then develop a blocked Gibbs sampler based on the Hamiltonian Monte Carlo method to explore the posterior of the truncated Poisson–Dirichlet process. The method is illustrated with the normal mean mixture model and the Caron–Fox network model. Numerical implementations are provided to demonstrate the effectiveness and performance of our algorithm.
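A minimal sketch of the idea the abstract describes (keep only the largest random weights and truncate): it uses the standard stick-breaking construction of the Dirichlet process rather than the paper's Poisson–Dirichlet construction, and the function name, parameters (`alpha`, `base_measure`, `K`, `n_sticks`), and example values are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def truncated_sorted_dp(alpha, base_measure, K, n_sticks=1000):
    """Finite approximation of DP(alpha, H) keeping the K largest stick-breaking weights."""
    # Stick-breaking: V_i ~ Beta(1, alpha), w_i = V_i * prod_{j<i} (1 - V_j)
    v = rng.beta(1.0, alpha, size=n_sticks)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    # Sort the weights in descending order, keep the K largest, and renormalize
    w = np.sort(w)[::-1][:K]
    w /= w.sum()
    # Draw one atom from the base measure H for each retained weight
    atoms = base_measure(K)
    return w, atoms

# Example: base measure H = N(0, 1); draw observations from the truncated random measure
weights, atoms = truncated_sorted_dp(
    alpha=2.0, base_measure=lambda k: rng.normal(0.0, 1.0, size=k), K=20
)
samples = rng.choice(atoms, size=5, p=weights)
print(weights[:5], samples)
```

Because the retained weights form a decreasing sequence, the probability mass discarded by the truncation is concentrated in the smallest components, which is the intuition behind the lower truncation error claimed in the abstract.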
| Original language | English |
| --- | --- |
| Article number | 30 |
| Pages (from-to) | 1-20 |
| Number of pages | 20 |
| Journal | Statistics and Computing |
| Volume | 33 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Feb 2023 |
Keywords
- Bayesian nonparametric hierarchical models
- Dirichlet process
- Gibbs sampling
- Hamiltonian Monte Carlo
- Normal mean mixture models
- Poisson–Dirichlet process
ASJC Scopus subject areas
- Theoretical Computer Science
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Computational Theory and Mathematics