
General maximum likelihood empirical Bayes estimation of normal means. (English) Zbl 1168.62005
Summary: We propose a general maximum likelihood empirical Bayes (GMLEB) method for the estimation of a mean vector based on observations with i.i.d. normal errors. We prove that, under mild moment conditions on the unknown means, the average mean squared error (MSE) of the GMLEB is within an infinitesimal fraction of the minimum average MSE among all separable estimators that apply a single deterministic estimating function to the individual observations, provided that the risk is of greater order than \((\log n)^5/n\). We also prove that the GMLEB is uniformly approximately minimax in regular and weak \(\ell_p\) balls when the order of the length-normalized norm of the unknown means is between \((\log n)^{\kappa_1}/n^{1/(p\land 2)}\) and \(n/(\log n)^{\kappa_2}\). Simulation experiments demonstrate that the GMLEB outperforms the James-Stein and several state-of-the-art threshold estimators in a wide range of settings with little downside.
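
In essence, the GMLEB plugs a generalized (nonparametric) maximum likelihood estimate \(\hat G\) of the unknown prior into the Bayes rule, estimating each mean by the posterior expectation \(\mathrm{E}_{\hat G}[\theta\mid X_i]\). The Python fragment below is a minimal sketch of this plug-in idea, not the authors' implementation: it assumes unit error variance, approximates \(\hat G\) on a fixed grid with a plain EM iteration, and the helper names (npmle_weights, gmleb_estimate) and tuning constants are illustrative choices of this review.

    # Illustrative sketch only (not the authors' code): plug-in empirical Bayes
    # estimate of normal means with the prior fitted by maximum likelihood on a
    # fixed grid. Unit error variance, grid size, EM iteration count and the
    # helper names are assumptions made for this example.
    import numpy as np

    def npmle_weights(x, grid, n_iter=200):
        """EM iterations for the maximum likelihood mixing weights over a fixed
        grid of support points (a discretized Kiefer-Wolfowitz type NPMLE)."""
        lik = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2)  # N(u_j, 1) kernels
        w = np.full(grid.size, 1.0 / grid.size)                 # uniform start
        for _ in range(n_iter):
            post = lik * w                                       # responsibilities
            post /= post.sum(axis=1, keepdims=True)
            w = post.mean(axis=0)                                # EM weight update
        return w

    def gmleb_estimate(x, grid_size=300, n_iter=200):
        """Posterior means E[theta | X_i] under the fitted prior."""
        grid = np.linspace(x.min(), x.max(), grid_size)
        w = npmle_weights(x, grid, n_iter)
        lik = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2)
        num = (lik * w * grid).sum(axis=1)  # numerator: sum_j u_j w_j phi(x_i - u_j)
        den = (lik * w).sum(axis=1)         # marginal density (constant cancels)
        return num / den

    # toy example: sparse means observed with N(0, 1) noise
    rng = np.random.default_rng(0)
    theta = np.concatenate([np.zeros(900), rng.normal(5.0, 1.0, 100)])
    x = theta + rng.standard_normal(theta.size)
    print("average squared error:", np.mean((gmleb_estimate(x) - theta) ** 2))

The grid-plus-EM discretization is only one common way to approximate the generalized MLE; a different grid range, stopping rule, or noise variance would change the numbers but not the plug-in structure conveyed here.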

MSC:
62C12 Empirical decision procedures; empirical Bayes procedures
62G05 Nonparametric estimation
62H12 Estimation in multivariate analysis
62G08 Nonparametric regression and quantile regression
62G20 Asymptotic properties of nonparametric inference
65C60 Computational problems in statistics (MSC2010)
References:
[1] Abramovich, F., Benjamini, Y., Donoho, D. L. and Johnstone, I. M. (2006). Adapting to unknown sparsity by controlling the false discovery rate. Ann. Statist. 34 584-653. · Zbl 1092.62005
[2] Benjamini, Y. and Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. J. Roy. Statist. Soc. Ser. B 57 289-300. · Zbl 0809.62014
[3] Birgé, L. and Massart, P. (2001). Gaussian model selection. J. Eur. Math. Soc. 3 203-268. · Zbl 1037.62001
[4] Borell, C. (1975). The Brunn-Minkowski inequality in Gaussian space. Invent. Math. 30 207-216. · Zbl 0311.60007
[5] Brown, L. D. (1971). Admissible estimators, recurrent diffusions and insoluble boundary value problems. Ann. Math. Statist. 42 855-903. · Zbl 0246.62016
[6] Brown, L. D. and Greenshtein, E. (2007). Empirical Bayes and compound decision approaches for estimation of a high-dimensional vector of normal means. Ann. Statist. · Zbl 1166.62005
[7] Cai, T. T. (2002). On block thresholding in wavelet regression. Statist. Sinica 12 1241-1273. · Zbl 1004.62036
[8] Cai, T. T. and Silverman, B. W. (2001). Incorporating information on neighboring coefficients into wavelet estimation. Sankhyā Ser. B 63 127-148. · Zbl 1192.42020
[9] Carathéodory, C. (1911). Über den Variabilitätsbereich der Fourierschen Konstanten von positiven harmonischen Funktionen. Rend. Circ. Mat. Palermo 32 193-217. · JFM 42.0429.01
[10] Cover, T. M. (1984). An algorithm for maximizing expected log investment return. IEEE Trans. Inform. Theory 30 369-373. · Zbl 0541.90007
[11] Dempster, A. P., Laird, N. M. and Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm (with discussion). J. Roy. Statist. Soc. Ser. B 39 1-38. · Zbl 0364.62022
[12] Donoho, D. L. and Johnstone, I. M. (1994a). Minimax risk over \(\ell_p\)-balls for \(\ell_q\)-error. Probab. Theory Related Fields 99 277-303. · Zbl 0802.62006
[13] Donoho, D. L. and Johnstone, I. M. (1994b). Ideal spatial adaptation via wavelet shrinkage. Biometrika 81 425-455. · Zbl 0815.62019
[14] Donoho, D. L. and Johnstone, I. M. (1995). Adapting to unknown smoothness via wavelet shrinkage. J. Amer. Statist. Assoc. 90 1200-1224. · Zbl 0869.62024
[15] Efron, B. (2003). Robbins, empirical Bayes and microarrays. Ann. Statist. 31 366-378. · Zbl 1038.62099
[16] Efron, B. and Morris, C. (1972). Empirical Bayes on vector observations: An extension of Stein’s method. Biometrika 59 335-347. · Zbl 0238.62072
[17] Efron, B. and Morris, C. (1973). Stein’s estimation rule and its competitors-an empirical Bayes approach. J. Amer. Statist. Assoc. 68 117-130. · Zbl 0275.62005
[18] Foster, D. P. and George, E. I. (1994). The risk inflation criterion for multiple regression. Ann. Statist. 22 1947-1975. · Zbl 0829.62066
[19] George, E. (1986). Minimax multiple shrinkage estimation. Ann. Statist. 14 288-305. · Zbl 0602.62041
[20] Ghosal, S. and van der Vaart, A. W. (2001). Entropy and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities. Ann. Statist. 29 1233-1263. · Zbl 1043.62025
[21] Ghosal, S. and van der Vaart, A. W. (2007). Posterior convergence rates for Dirichlet mixtures at smooth densities. Ann. Statist. 35 697-723. · Zbl 1117.62046
[22] James, W. and Stein, C. (1961). Estimation with quadratic loss. In Proc. Fourth Berkeley Symp. Math. Statist. and Prob. 1 361-379. Univ. California Press, Berkeley. · Zbl 1281.62026
[23] Johnstone, I. M. (1994). Minimax Bayes, asymptotic minimax and sparse wavelet priors. In Statistical Decision Theory and Related Topics V (S. Gupta and J. Berger, eds.) 303-326. Springer, New York. · Zbl 0815.62017
[24] Johnstone, I. M. and Silverman, B. W. (2004). Needles and hay in haystacks: Empirical Bayes estimates of possibly sparse sequences. Ann. Statist. 32 1594-1649. · Zbl 1047.62008
[25] Kiefer, J. and Wolfowitz, J. (1956). Consistency of the maximum likelihood estimator in the presence of infinitely many incidental parameters. Ann. Math. Statist. 27 887-906. · Zbl 0073.14701
[26] Morris, C. N. (1983). Parametric empirical Bayes inference: Theory and applications. J. Amer. Statist. Assoc. 78 47-55. · Zbl 0506.62005
[27] Robbins, H. (1951). Asymptotically subminimax solutions of compound statistical decision problems. In Proc. Second Berkeley Symp. Math. Statist. Probab. 1 131-148. Univ. California Press, Berkeley. · Zbl 0044.14803
[28] Robbins, H. (1956). An empirical Bayes approach to statistics. In Proc. Third Berkeley Symp. Math. Statist. Probab. 1 157-163. Univ. California Press, Berkeley. · Zbl 0074.35302
[29] Robbins, H. (1964). The empirical Bayes approach to statistical decision problems. Ann. Math. Statist. 35 1-20. · Zbl 0138.12304
[30] Robbins, H. (1983). Some thoughts on empirical Bayes estimation. Ann. Statist. 11 713-723. · Zbl 0522.62024
[31] Stein, C. (1956). Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. In Proc. Third Berkeley Symp. Math. Statist. Probab. 1 197-206. Univ. California Press, Berkeley. · Zbl 0073.35602
[32] Tang, W. and Zhang, C.-H. (2005). Bayes and empirical Bayes approaches to controlling the false discovery rate. Technical Report 2005-2004, Dept. Statistics and Biostatistics, Rutgers Univ.
[33] Tang, W. and Zhang, C.-H. (2007). Empirical Bayes methods for controlling the false discovery rate with dependent data. In Complex Datasets and Inverse Problems: Tomography, Networks, and Beyond (R. Liu, W. Strawderman and C.-H. Zhang, eds.). Lecture Notes-Monograph Series 54 151-160. IMS, Beachwood, OH.
[34] van der Vaart, A. W. and Wellner, J. A. (1996). Weak Convergence and Empirical Processes . Springer, New York. · Zbl 0862.60002
[35] Vardi, Y. and Lee, D. (1993). From image deblurring to optimal investments: Maximum likelihood solutions for positive linear inverse problems (with discussion). J. Roy. Statist. Soc. Ser. B 55 569-612. · Zbl 0798.62110
[36] Zhang, C.-H. (1997). Empirical Bayes and compound estimation of normal means. Statist. Sinica 7 181-193. · Zbl 0904.62008
[37] Zhang, C.-H. (2003). Compound decision theory and empirical Bayes method. Ann. Statist. 31 379-390. · Zbl 1039.62005
[38] Zhang, C.-H. (2005). General empirical Bayes wavelet methods and exactly adaptive minimax estimation. Ann. Statist. 33 54-100. · Zbl 1064.62009
[39] Zhang, C.-H. (2008). Generalized maximum likelihood estimation of normal mixture densities. Statist. Sinica. · Zbl 1166.62013