

Matrix rank and inertia formulas in the analysis of general linear models. (English) Zbl 1359.15003
Summary: Matrix mathematics provides a powerful tool set for addressing statistical problems; in particular, the theory of matrix ranks and inertias has been developed into an effective methodology for simplifying complicated matrix expressions and for establishing equalities and inequalities that occur in statistical analysis. This paper describes how to establish exact formulas for the ranks and inertias of the covariance matrices of predictors and estimators of the parameter spaces in general linear models (GLMs), and how to use these formulas in the statistical analysis of GLMs. We first derive analytical expressions for the best linear unbiased predictors/best linear unbiased estimators (BLUPs/BLUEs) of all unknown parameters in the model by solving a constrained optimization problem for a quadratic matrix-valued function, and present some well-known results on ordinary least-squares predictors/ordinary least-squares estimators (OLSPs/OLSEs). We then establish some fundamental rank and inertia formulas for covariance matrices related to BLUPs/BLUEs and OLSPs/OLSEs, and use these formulas to characterize a variety of equalities and inequalities for the covariance matrices of BLUPs/BLUEs and OLSPs/OLSEs. As applications, we use these equalities and inequalities to compare the covariance matrices of BLUPs/BLUEs and OLSPs/OLSEs. The work on the formulations of BLUPs/BLUEs and OLSPs/OLSEs and their covariance matrices under GLMs provides direct access, as a standard example, to a very simple algebraic treatment of predictors and estimators in linear regression analysis, which leads to deep insight into the linear nature of GLMs and gives an efficient way of summarizing the results.
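For orientation, here is a minimal sketch of the setting in which such rank/inertia comparisons live; it uses only standard textbook facts and is not quoted from the paper. Consider the GLM $\mathbf{y} = X\boldsymbol{\beta} + \boldsymbol{\varepsilon}$ with $E(\boldsymbol{\varepsilon}) = 0$ and $\operatorname{Cov}(\boldsymbol{\varepsilon}) = \sigma^{2}\Sigma$, and let $P_X$ denote the orthogonal projector onto the column space of $X$. Then
$$\operatorname{Cov}\bigl(\mathrm{OLSE}(X\beta)\bigr) \;-\; \operatorname{Cov}\bigl(\mathrm{BLUE}(X\beta)\bigr) \;\succeq\; 0 \qquad \text{(Löwner ordering)},$$
and the two estimators coincide (with probability 1) precisely when this difference has rank zero, which is the classical commutation condition $P_X\Sigma = \Sigma P_X$. Exact rank and inertia formulas of the kind developed in the paper measure, in closed form, how far such covariance differences are from zero and how their positive and negative eigenvalues are distributed.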

MSC:
15A03 Vector spaces, linear dependence, rank, lineability
15A09 Theory of matrix inversion and generalized inverses
62H12 Estimation in multivariate analysis
62J05 Linear regression; mixed models
References:
[1] Markiewicz A., Puntanen S., All about the ⊥ with its applications in the linear statistical models, Open Math., 2015, 13, 33-50 · Zbl 1308.62145
[2] Puntanen S., Styan G.P.H., Isotalo J., Matrix Tricks for Linear Statistical Models: Our Personal Top Twenty, Springer, Berlin Heidelberg, 2011 · Zbl 1291.62014
[3] Rao C.R., Mitra S.K., Generalized Inverse of Matrices and Its Applications, Wiley, New York, 1971
[4] Tian Y., Equalities and inequalities for inertias of Hermitian matrices with applications, Linear Algebra Appl., 2010, 433, 263-296 · Zbl 1205.15033
[5] Tian Y., Some equalities and inequalities for covariance matrices of estimators under linear model, Stat. Papers, 2015 · Zbl 1365.62207
[6] Tian Y., Guo W., On comparison of dispersion matrices of estimators under a constrained linear model, Stat. Methods Appl., 2016, 25, 623-649 · Zbl 1392.62209
[7] Tian Y., Jiang B., Matrix rank/inertia formulas for least-squares solutions with statistical applications, Spec. Matrices, 2016, 4, 130-140 · Zbl 1333.15006
[8] Tian Y., Jiang B., Quadratic properties of least-squares solutions of linear matrix equations with statistical applications, Comput. Statist. · Zbl 1417.15018
[9] Dong B., Guo W., Tian Y., On relations between BLUEs under two transformed linear models, J. Multivariate Anal., 2014, 131, 279-292 · Zbl 1299.62055
[10] Lowerre J.M., Some simplifying results on BLUEs, J. Amer. Stat. Assoc., 1977, 72, 433-437 · Zbl 0369.62085
[11] Rao C.R., A lemma on optimization of matrix function and a review of the unified theory of linear estimation, In: Y. Dodge (ed.), Statistical Data Analysis and Inference, North-Holland, Elsevier, 1989, 397-417 · Zbl 0735.62066
[12] Goldberger A.S., Best linear unbiased prediction in the generalized linear regression model, J. Amer. Stat. Assoc., 1962, 57, 369-375 · Zbl 0124.35502
[13] Marsaglia G., Styan G.P.H., Equalities and inequalities for ranks of matrices, Linear Multilinear Algebra, 1974, 2, 269-292 · Zbl 0297.15003
[14] Tian Y., More on maximal and minimal ranks of Schur complements with applications, Appl. Math. Comput., 2004, 152, 675-692 · Zbl 1077.15005
[15] Penrose R., A generalized inverse for matrices, Proc. Cambridge Phil. Soc., 1955, 51, 406-413 · Zbl 0065.24603
[16] Lange K., Chi E.C., Zhou H., A brief survey of modern optimization for statisticians, Internat. Stat. Rev., 2014, 82, 46-70
[17] Tian Y., A new derivation of BLUPs under random-effects model, Metrika, 2015, 78, 905-918 · Zbl 1329.62264
[18] Rao C.R., Unified theory of linear estimation, Sankhyā Ser. A, 1971, 33, 371-394 · Zbl 0236.62048
[19] Rao C.R., Representations of best linear unbiased estimators in the Gauss-Markoff model with a singular dispersion matrix, J. Multivariate Anal., 1973, 3, 276-292 · Zbl 0276.62068
[20] Rao C.R., Toutenburg H., Shalabh, Heumann C., Linear Models and Generalizations: Least Squares and Alternatives, 3rd ed., Springer, Berlin Heidelberg, 2008 · Zbl 1151.62063
[21] Searle S.R., The matrix handling of BLUE and BLUP in the mixed linear model, Linear Algebra Appl., 1997, 264, 291-311 · Zbl 0889.62059
[22] Gan S., Sun Y., Tian Y., Equivalence of predictors under real and over-parameterized linear models, Comm. Stat. Theory Meth., 2016 · Zbl 1369.62118
[23] Tian Y., Jiang B., A new analysis of the relationships between a general linear model and its mis-specified forms, J. Korean Stat. Soc., 2016 · Zbl 1362.62143
[24] Baksalary J.K., Puntanen S., Characterizations of the best linear unbiased estimator in the general Gauss-Markov model with the use of matrix partial orderings, Linear Algebra Appl., 1990, 127, 363-370 · Zbl 0695.62152
[25] Baksalary J.K., Puntanen S., Styan G.P.H., A property of the dispersion matrix of the best linear unbiased estimator in the general Gauss-Markov model, Sankhyā Ser. A, 1990, 52, 279-296 · Zbl 0727.62072
[26] Isotalo J., Puntanen S., Styan G.P.H., The BLUE’s covariance matrix revisited: A review, J. Stat. Plann. Inference, 2008, 138, 2722-2737 · Zbl 1141.62325
[27] Puntanen S., Styan G.P.H., Tian Y., Three rank formulas associated with the covariance matrices of the BLUE and the OLSE in the general linear model, Econometric Theory, 2005, 21, 659-664 · Zbl 1072.62049
[28] Abdulle A., Wanner G., 200 years of least squares method, Elem. Math., 2002, 57, 45-60 · Zbl 1003.01008
[29] Farebrother R.W., Some early statistical contributions to the theory and practice of linear algebra, Linear Algebra Appl., 1996, 237/238, 205-224 · Zbl 0845.62048
[30] Paris Q., The dual of the least-squares method, Open J. Statist., 2015, 5, 658-664
[31] Stigler S.M., Gauss and the invention of least squares, Ann. Stat., 1981, 9, 465-474 · Zbl 0477.62001
[32] Graybill F.A., An Introduction to Linear Statistical Models, Vol. I, McGraw-Hill, New York, 1961 · Zbl 0121.35605
[33] Searle S.R., Linear Models, Wiley, New York, 1971 · Zbl 0218.62071
[34] Puntanen S., Styan G.P.H., The equality of the ordinary least squares estimator and the best linear unbiased estimator, with comments by O. Kempthorne, S.R. Searle, and a reply by the authors, Amer. Statistician, 1989, 43, 153-164
[35] Alalouf I.S., Styan G.P.H., Characterizations of the conditions for the ordinary least squares estimator to be best linear unbiased, in: Y.P. Chaubey, T.D. Dwivedi (Eds.), Topics in Applied Statistics, Concordia University, Montréal, 1984, 331-344
[36] Baksalary J.K., Criteria for the equality between ordinary least squares and best linear unbiased estimators under certain linear models, Canad. J. Stat., 1988, 16, 97-102 · Zbl 0645.62072
[37] Baksalary J.K., Kala R., An extension of a rank criterion for the least squares estimator to be the best linear unbiased estimator, J. Stat. Plann. Inference, 1977, 1, 309-312 · Zbl 0383.62041
[38] Baksalary J.K., Kala R., Simple least squares estimation versus best linear unbiased prediction, J. Stat. Plann. Inference, 1981, 5, 147-151 · Zbl 0476.62057
[39] Baksalary J.K., van Eijnsbergen A.C., Comparison of two criteria for ordinary-least-squares estimators to be best linear unbiased estimators, Amer. Statistician, 1988, 42, 205-208
[40] Baksalary O.M., Trenkler G., Between OLSE and BLUE, Aust. N.Z.J. Stat., 2011, 53, 289-303 · Zbl 1334.62106
[41] Baksalary O.M., Trenkler G., Liski E.P., Let us do the twist again, Stat. Papers, 2013, 54, 1109-1119 · Zbl 1416.62386
[42] Haslett S.J., Isotalo J., Liu Y., Puntanen S., Equalities between OLSE, BLUE and BLUP in the linear model, Stat. Papers, 2014, 55, 543-561 · Zbl 1334.62110
[43] Haslett S.J., Puntanen S., A note on the equality of the BLUPs for new observations under two linear models, Acta Comment. Univ. Tartu. Math., 2010, 14, 27-33 · Zbl 1229.15023
[44] Haslett S.J., Puntanen S., Equality of BLUEs or BLUPs under two linear models using stochastic restrictions, Stat. Papers, 2010, 51, 465-475 · Zbl 1247.62167
[45] Haslett S.J., Puntanen S., On the equality of the BLUPs under two linear mixed models, Metrika, 2011, 74, 381-395 · Zbl 1226.62066
[46] Herzberg A.M., Aleong J., Further conditions on the equivalence of ordinary least squares and weighted least squares estimators with examples, in: J. Lanke, G. Lindgren (Eds.), Contributions to Probability and Statistics in Honour of Gunnar Blom, University of Lund, 1985, 127-142
[47] Isotalo J., Puntanen S., A note on the equality of the OLSE and the BLUE of the parametric functions in the general Gauss-Markov model, Stat. Papers, 2009, 50, 185-193 · Zbl 1309.62113
[48] Jiang B., Sun Y., On the equality of estimators under a general partitioned linear model with parameter restrictions, Stat. Papers, 2016
[49] Kruskal W., When are Gauss-Markov and least squares estimators identical? A coordinate-free approach, Ann. Math. Statist., 1968, 39, 70-75 · Zbl 0162.21902
[50] Liski E.P., Puntanen S., Wang S., Bounds for the trace of the difference of the covariance matrices of the OLSE and BLUE, Linear Algebra Appl., 1992, 176, 121-130 · Zbl 0753.62033
[51] McElroy F.W., A necessary and sufficient condition that ordinary least-squares estimators be best linear unbiased, J. Amer. Stat. Assoc., 1967, 62, 1302-1304 · Zbl 0153.48102
[52] Milliken G.A., Albohali M., On necessary and sufficient conditions for ordinary least squares estimators to be best linear unbiased estimators, Amer. Statistician, 1984, 38, 298-299
[53] Norlén U., The covariance matrices for which least squares is best linear unbiased, Scand. J. Statist., 1975, 2, 85-90 · Zbl 0322.62081
[54] Styan G.P.H., When does least squares give the best linear unbiased estimate?, in: D.G. Kabe, R.P. Gupta (Eds.), Multivariate Statistical Inference, North-Holland, Amsterdam, 1973, 241-246
[55] Tian Y., On equalities of estimations of parametric functions under a general linear model and its restricted models, Metrika, 2010, 72, 313-330 · Zbl 1197.62020
[56] Tian Y., On properties of BLUEs under general linear regression models, J. Stat. Plann. Inference, 2013, 143, 771-782 · Zbl 1428.62344
[57] Tian Y., Zhang J., Some equalities for estimations of partial coefficients under a general linear regression model, Stat. Papers, 2011, 52, 911-920 · Zbl 1229.62075
[58] Tian Y., Zhang X., On connections among OLSEs and BLUEs of whole and partial parameters under a general linear model, Stat. Prob. Lett., 2016, 112, 105-112 · Zbl 1341.62133
[59] Abadir K.M., Magnus J.R., Matrix Algebra, Cambridge University Press, 2005
[60] Banerjee S., Roy A., Linear Algebra and Matrix Analysis for Statistics, CRC Press, New York, 2014 · Zbl 1309.15002
[61] Bapat R.B., Linear Algebra and Linear Models, 3rd ed., Springer, Berlin Heidelberg, 2012 · Zbl 0834.62062
[62] Eldén L., Matrix Methods in Data Mining and Pattern Recognition, Society for Industrial and Applied Mathematics (SIAM), Philadelphia, 2007 · Zbl 1120.68092
[63] Fieller N., Basics of Matrix Algebra for Statistics with R, Chapman and Hall/CRC, 2015 · Zbl 1317.15001
[64] Gentle J.E., Numerical Linear Algebra for Applications in Statistics, Springer, Berlin Heidelberg, 1998 · Zbl 0908.65015
[65] Gentle J.E., Matrix Algebra: Theory, Computations, and Applications in Statistics, Springer, Berlin Heidelberg, 2007 · Zbl 1133.15001
[66] Graybill F.A., Matrices with Applications in Statistics, 2nd ed., Brooks/Cole, 2002
[67] Harville D.A., Matrix Algebra From a Statistician’s Perspective, Springer, New York, 1997 · Zbl 0881.15001
[68] Harville D.A., Matrix Algebra: Exercises and Solutions, Springer, New York, 2001 · Zbl 1076.15500
[69] Healy M.J.R., Matrices for Statistics, 2nd ed., Oxford University Press, 2000 · Zbl 0727.62005
[70] Magnus J.R., Neudecker H., Matrix Differential Calculus with Applications in Statistics and Econometrics, Revised edition of the 1988 original, Wiley, New York, 1999 · Zbl 0912.15003
[71] Rao C.R., Rao M.B., Matrix Algebra and Its Applications to Statistics and Econometrics, World Scientific, Singapore, 1998 · Zbl 0915.15001
[72] Schott J.R., Matrix Analysis for Statistics, 2nd ed., Wiley, Hoboken, NJ, 2005 · Zbl 1076.15002
[73] Searle S.R., Matrix Algebra Useful for Statistics, Wiley, New York, 1982 · Zbl 0555.62002
[74] Seber G.A.F., A Matrix Handbook for Statisticians, Wiley, New York, 2008 · Zbl 0141.36602
[75] Lu C., Gan S., Tian Y., Some remarks on general linear model with new regressors, Stat. Prob. Lett., 2015, 97, 16-24 · Zbl 1312.62091
[76] Tian Y., A matrix handling of predictions under a general linear random-effects model with new observations, Electron. J. Linear Algebra, 2015, 29, 30-45 · Zbl 1329.62321
[77] Tian Y., Jiang B., Equalities for estimators of partial parameters under linear model with restrictions, J. Multivariate Anal., 2016, 143, 299-313 · Zbl 1328.62347
[78] Tian Y., Jiang B., An algebraic study of BLUPs under two linear random-effects models with correlated covariance matrices, Linear Multilinear Algebra, 2016, 64, 2351-2367 · Zbl 1358.15012
[79] Zhang X., Tian Y., On decompositions of BLUEs under a partitioned linear model with restrictions, Stat. Papers, 2016, 57, 345-364 · Zbl 1341.62142
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.