
General Bayesian updating and the loss-likelihood bootstrap. (English) Zbl 1454.62098

The weighted likelihood bootstrap is a method that generates samples from an approximate Bayesian posterior of a parametric model. The paper under review shows that the same method can be derived, without approximation, under a Bayesian nonparametric model in which the parameter of interest is defined by minimizing the expected negative log-likelihood under an unknown sampling distribution. This allows the weighted likelihood bootstrap to be extended to posterior sampling for parameters that minimize an expected loss. The authors call this method the loss-likelihood bootstrap and connect it to general Bayesian updating, i.e., a way of updating prior belief distributions that does not require the construction of a global probability model but does require the calibration of two forms of loss function. The loss-likelihood bootstrap is then used to calibrate the general Bayesian posterior by matching asymptotic Fisher information. Finally, the proposed method is illustrated on a number of examples.
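The sampling scheme described above can be sketched as follows: each posterior draw is obtained by minimizing a Dirichlet-weighted empirical loss over the observed data. This is an illustrative sketch only; the function names and the squared-error example are assumptions of the reviewer, not code from the paper.

```python
import numpy as np

def loss_likelihood_bootstrap(data, weighted_argmin, n_draws=1000, seed=None):
    """Sketch of the loss-likelihood bootstrap.

    Each draw: sample uniform Dirichlet weights over the n observations,
    then return the parameter minimizing the weighted empirical loss
    sum_i w_i * loss(x_i, theta). `weighted_argmin` is a hypothetical
    user-supplied solver for that weighted minimization.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    draws = []
    for _ in range(n_draws):
        w = rng.dirichlet(np.ones(n))           # weights sum to 1
        draws.append(weighted_argmin(data, w))  # one posterior sample
    return np.array(draws)

# Example: under squared-error loss, the weighted minimizer is the
# weighted mean, so the solver reduces to a dot product.
data = np.random.default_rng(0).normal(loc=2.0, scale=1.0, size=200)
theta = loss_likelihood_bootstrap(data, lambda x, w: np.sum(w * x), seed=1)
```

With a log-likelihood as the loss, this reduces to the weighted likelihood bootstrap; with a generic loss, the draws approximate the loss-based posterior discussed in the paper.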

MSC:

62F15 Bayesian inference
62F40 Bootstrap, jackknife and other resampling methods
62B10 Statistical aspects of information-theoretic topics