Slice sampling mixture models. (English) Zbl 1256.65006

Summary: We propose a more efficient version of the slice sampler for Dirichlet process mixture models described by S. G. Walker [Commun. Stat., Simulation Comput. 36, No. 1, 45–54 (2007; Zbl 1113.62058)]. This new sampler allows for the fitting of infinite mixture models with a wide range of prior specifications. To illustrate this flexibility we consider priors defined through infinite sequences of independent positive random variables. Two applications are considered: density estimation using mixture models and hazard function estimation. In each case we show how the slice-efficient sampler can be applied to make inference in the models. In the mixture case, two submodels are studied in detail. The first one assumes that the positive random variables are gamma distributed and the second assumes that they are inverse-Gaussian distributed. Both priors have two hyperparameters and we consider their effect on the prior distribution of the number of occupied clusters in a sample. Extensive computational comparisons with alternative “conditional” simulation techniques for mixture models using the standard Dirichlet process prior and our new priors are made. The properties of the new priors are illustrated on a density estimation problem.
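The key device behind slice samplers of this kind is adaptive truncation: a latent uniform variable u is introduced so that only the finitely many mixture weights exceeding u can ever be allocated, making the infinite model computationally finite. The sketch below illustrates this for the standard Beta(1, α) stick-breaking representation of the Dirichlet process (Sethuraman [20]); the function name and stopping rule are illustrative assumptions, not the authors' implementation.

```python
import random

def stick_breaking_weights(alpha, u_min, rng=random.Random(0)):
    """Generate Dirichlet process stick-breaking weights
    w_j = v_j * prod_{l<j} (1 - v_l), with v_j ~ Beta(1, alpha),
    stopping once the leftover stick mass drops below u_min.

    In a slice sampler only atoms with w_j > u can be allocated,
    so weights smaller than the slice variable u never need to be
    sampled -- this is the adaptive truncation of the infinite sum.
    """
    weights, leftover = [], 1.0
    while leftover > u_min:
        v = rng.betavariate(1.0, alpha)  # stick-break proportion
        weights.append(v * leftover)     # piece broken off the stick
        leftover *= 1.0 - v              # mass remaining to break
    return weights
```

On exit the generated weights sum to more than 1 − u_min, so every atom that the slice variable could select has been instantiated. The paper's generalization replaces the Beta stick-breaking proportions with normalized sequences of independent gamma or inverse-Gaussian random variables.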


65C50 Other computational problems in probability (MSC2010)
65C05 Monte Carlo methods
65C40 Numerical analysis or methods applied to Markov chains
60J22 Computational methods in Markov chains
60G15 Gaussian processes




[1] Celeux, G., Hurn, M., Robert, C.P.: Computational and inferential difficulties with mixture posterior distributions. J. Am. Stat. Assoc. 95, 957–970 (2000) · Zbl 0999.62020
[2] Devroye, L.: Non-Uniform Random Variate Generation. Springer, New York (1986) · Zbl 0593.65005
[3] Dunson, D.: Kernel local partition processes for functional data. Discussion paper 2008-26, Department of Statistical Science, Duke University (2008)
[4] Escobar, M.D.: Estimating the means of several normal populations by nonparametric estimation of the distribution of the means. Unpublished Ph.D. dissertation, Department of Statistics, Yale University (1988)
[5] Escobar, M.D.: Estimating normal means with a Dirichlet process prior. J. Am. Stat. Assoc. 89, 268–277 (1994) · Zbl 0791.62039
[6] Escobar, M.D., West, M.: Bayesian density estimation and inference using mixtures. J. Am. Stat. Assoc. 90, 577–588 (1995) · Zbl 0826.62021
[7] Ferguson, T.S.: A Bayesian analysis of some nonparametric problems. Ann. Stat. 1, 209–230 (1973) · Zbl 0255.62037
[8] Gilks, W.R., Best, N.G., Tan, K.K.C.: Adaptive rejection Metropolis sampling within Gibbs sampling. Appl. Stat. 44, 455–472 (1995) · Zbl 0893.62110
[9] Green, P.J., Richardson, S.: Modelling heterogeneity with and without the Dirichlet process. Scand. J. Stat. 28, 355–375 (2001) · Zbl 0973.62031
[10] Ishwaran, H., James, L.F.: Gibbs sampling methods for stick-breaking priors. J. Am. Stat. Assoc. 96, 161–173 (2001) · Zbl 1014.62006
[11] Ishwaran, H., Zarepour, M.: Markov chain Monte Carlo in approximate Dirichlet and beta two-parameter process hierarchical models. Biometrika 87, 371–390 (2000) · Zbl 0949.62037
[12] Lijoi, A., Mena, R.H., Prünster, I.: Hierarchical mixture modeling with normalized inverse-Gaussian priors. J. Am. Stat. Assoc. 100, 1278–1291 (2005) · Zbl 1117.62386
[13] Lijoi, A., Mena, R.H., Prünster, I.: Controlling the reinforcement in Bayesian nonparametric mixture models. J. R. Stat. Soc. B 69, 715–740 (2007)
[14] Lo, A.Y.: On a class of Bayesian nonparametric estimates I. Density estimates. Ann. Stat. 12, 351–357 (1984) · Zbl 0557.62036
[15] MacEachern, S.N.: Estimating normal means with a conjugate style Dirichlet process prior. Commun. Stat., Simul. Comput. 23, 727–741 (1994) · Zbl 0825.62053
[16] MacEachern, S.N., Müller, P.: Estimating mixtures of Dirichlet process models. J. Comput. Graph. Stat. 7, 223–238 (1998)
[17] Neal, R.: Markov chain sampling methods for Dirichlet process mixture models. J. Comput. Graph. Stat. 9, 249–265 (2000)
[18] Papaspiliopoulos, O.: A note on posterior sampling from Dirichlet mixture models. Preprint (2008)
[19] Papaspiliopoulos, O., Roberts, G.O.: Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models. Biometrika 95, 169–186 (2008) · Zbl 1437.62576
[20] Sethuraman, J.: A constructive definition of Dirichlet priors. Stat. Sin. 4, 639–650 (1994) · Zbl 0823.62007
[21] Sokal, A.: Monte Carlo methods in statistical mechanics: foundations and new algorithms. In: Functional Integration (Cargèse, 1996). NATO Adv. Sci. Inst. Ser. B Phys., vol. 361, pp. 131–192. Plenum, New York (1997) · Zbl 0890.65006
[22] Smith, A.F.M., Roberts, G.O.: Bayesian computations via the Gibbs sampler and related Markov chain Monte Carlo methods. J. R. Stat. Soc., Ser. B 55, 3–23 (1993) · Zbl 0779.62030
[23] Van Gael, J., Saatchi, Y., Teh, Y.W., Ghahramani, Z.: Beam sampling for the infinite hidden Markov model. Technical report, Engineering Department, University of Cambridge (2008)
[24] Walker, S.G.: Sampling the Dirichlet mixture model with slices. Commun. Stat., Simul. Comput. 36, 45–54 (2007) · Zbl 1113.62058
[25] Yau, C., Papaspiliopoulos, O., Roberts, G.O., Holmes, C.: Bayesian nonparametric hidden Markov models with application to the analysis of copy-number-variation in mammalian genomes. Technical Report, Man Institute, Oxford (2008)