
Contribution to the theory of Pitman estimators. (English) Zbl 1305.62099

J. Math. Sci., New York 199, No. 2, 202–214 (2014) and Zap. Nauchn. Semin. POMI 408, 245–267 (2012).
Summary: New inequalities are proved for the variance of the Pitman estimators (minimum-variance equivariant estimators) of \(\theta\) constructed from samples of fixed size from populations \(F(x - \theta)\). The inequalities are closely related to the classical Stam inequality for the Fisher information, its small-sample analog, and a powerful variance drop inequality. The only condition required is that \(F\) have finite variance; not even absolute continuity of \(F\) is assumed. As corollaries of the main inequalities for small samples, one obtains alternative proofs of known properties of the Fisher information, as well as new observations such as the fact that the variance of the Pitman estimator based on a sample of size \(n\), scaled by \(n\), decreases monotonically in \(n\). Extensions of the results to polynomial versions of the Pitman estimators and to a multivariate location parameter are given. Finally, the search for a characterization of the equality conditions in one of the inequalities leads to a Cauchy-type functional equation for independent random variables, and an interesting new behavior of its solutions is described.
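For orientation, the two central objects the summary refers to can be stated in standard notation (a sketch for the reader's convenience; the formulations below are the classical ones, not quoted from the paper). When \(F\) has a density \(f\), the Pitman estimator of \(\theta\) from a sample \(X_1, \dots, X_n\) is the minimum-variance equivariant estimator under squared-error loss,
\[
\hat\theta_n = \frac{\int_{\mathbb{R}} t \prod_{i=1}^{n} f(X_i - t)\, dt}{\int_{\mathbb{R}} \prod_{i=1}^{n} f(X_i - t)\, dt},
\]
and without absolute continuity it can still be written, for finite-variance \(F\), as \(\hat\theta_n = \bar X - E_0[\bar X \mid X_1 - X_n, \dots, X_{n-1} - X_n]\). The classical Stam inequality states that for independent random variables \(X\) and \(Y\) with finite Fisher informations \(I(X)\) and \(I(Y)\),
\[
\frac{1}{I(X + Y)} \geq \frac{1}{I(X)} + \frac{1}{I(Y)},
\]
with equality if and only if \(X\) and \(Y\) are Gaussian. In this notation, the monotonicity property mentioned above reads \((n+1)\operatorname{Var}(\hat\theta_{n+1}) \leq n\operatorname{Var}(\hat\theta_n)\).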

MSC:

62F10 Point estimation

References:

[1] S. Artstein, K. M. Ball, F. Barthe, and A. Naor, “Solution of Shannon’s problem on the monotonicity of entropy,” J. Amer. Math. Soc. (Electronic), 17, 975–982 (2004). · Zbl 1062.94006 · doi:10.1090/S0894-0347-04-00459-X
[2] A. DasGupta, “Letter to the editors,” IMS Bulletin, 37, 16 (2008).
[3] E. A. Carlen, “Superadditivity of Fisher’s information and logarithmic Sobolev inequalities,” J. Funct. Anal., 101, 194–211 (1991). · Zbl 0732.60020 · doi:10.1016/0022-1236(91)90155-X
[4] B. Efron and C. Stein, “The jackknife estimate of variance,” Ann. Statist., 9, 586–596 (1981). · Zbl 0481.62035 · doi:10.1214/aos/1176345462
[5] W. Hoeffding, “A class of statistics with asymptotically normal distribution,” Ann. Math. Statist., 19, 293–325 (1948). · Zbl 0032.04101 · doi:10.1214/aoms/1177730196
[6] J. Hoffmann-Jørgensen, A. M. Kagan, L. D. Pitt, and L. A. Shepp, “Strong decomposition of random variables,” J. Theor. Probab., 20, 211–220 (2007). · Zbl 1121.60011 · doi:10.1007/s10959-007-0061-6
[7] I. A. Ibragimov and R. Z. Hasminskii, Statistical Estimation: Asymptotic Theory, Applications of Mathematics, Vol. 16, Springer, New York (1981). · Zbl 0705.62039
[8] A. Kagan and Z. Landsman, “Statistical meaning of Carlen’s superadditivity of the Fisher information,” Statist. Probab. Lett., 32, 175–179 (1997). · Zbl 0874.60002 · doi:10.1016/S0167-7152(96)00070-3
[9] A. M. Kagan and Ya. Malinovsky, “Monotonicity in the sample size of the length of classical confidence intervals,” Statist. Probab. Lett. (2013, accepted). · Zbl 1489.62092
[10] A. Kagan, “An inequality for the Pitman estimators related to the Stam inequality,” Sankhyā, Ser. A, 64, 281–292 (2002). · Zbl 1192.62099
[11] A. M. Kagan, “On the estimation theory of location parameter,” Sankhyā, Ser. A, 28, 335–352 (1966). · Zbl 0156.39207
[12] A. M. Kagan, “Fisher information contained in a finite-dimensional linear space, and a properly formulated version of the method of moments,” Probl. Peredachi Inform., 12, 20–42 (1976). · Zbl 0379.62004
[13] A. M. Kagan, L. B. Klebanov, and S. M. Fintušal, “Asymptotic behavior of polynomial Pitman estimators,” Zap. Nauchn. Semin. LOMI, 43, 30–39 (1974). · Zbl 0358.62028
[14] E. Lukacs, Characteristic Functions, 2nd ed., Hafner Publishing Co., New York (1970). · Zbl 0201.20404
[15] M. Madiman, A. R. Barron, A. M. Kagan, and T. Yu, “Fundamental limits for distributed estimation: the case of a location parameter,” Preprint (2009).
[16] M. Madiman, A. R. Barron, A. M. Kagan, and T. Yu, “A model for pricing data bundles based on minimax risks for estimation of a location parameter,” in: Proc. IEEE Inform. Theory Workshop, Volos, Greece (June 2009).
[17] M. Madiman and A. R. Barron, “Generalized entropy power inequalities and monotonicity properties of information,” IEEE Trans. Inform. Theory, 53, 2317–2329 (2007). · Zbl 1326.94034 · doi:10.1109/TIT.2007.899484
[18] J. Shao, Mathematical Statistics, 2nd ed., Springer, New York (2003). · Zbl 1018.62001
[19] N.-Z. Shi, “Letter to the Editors,” IMS Bulletin, 36, 4 (2008).
[20] A. J. Stam, “Some inequalities satisfied by the quantities of information of Fisher and Shannon,” Inform. Control, 2, 101–112 (1959). · Zbl 0085.34701 · doi:10.1016/S0019-9958(59)90348-1
[21] R. Zamir, “A proof of the Fisher information inequality via a data processing argument,” IEEE Trans. Inform. Theory, 44, 1246–1250 (1998). · Zbl 0901.62005 · doi:10.1109/18.669301