Sun, Tingni; Zhang, Cun-Hui
Scaled sparse linear regression. (English) Zbl 1452.62515
Biometrika 99, No. 4, 879-898 (2012).

Summary: Scaled sparse linear regression jointly estimates the regression coefficients and noise level in a linear model. It chooses an equilibrium with a sparse regression method by iteratively estimating the noise level via the mean residual square and scaling the penalty in proportion to the estimated noise level. The iterative algorithm costs little beyond the computation of a path or grid of the sparse regression estimator for penalty levels above a proper threshold. For the scaled lasso, the algorithm is a gradient descent in a convex minimization of a penalized joint loss function for the regression coefficients and noise level. Under mild regularity conditions, we prove that the scaled lasso simultaneously yields an estimator for the noise level and an estimated coefficient vector satisfying certain oracle inequalities for prediction, the estimation of the noise level and the regression coefficients. These inequalities provide sufficient conditions for the consistency and asymptotic normality of the noise-level estimator, including certain cases where the number of variables is of greater order than the sample size. Parallel results are provided for least-squares estimation after model selection by the scaled lasso. Numerical results demonstrate the superior performance of the proposed methods over an earlier proposal of joint convex minimization.

Cited in 87 Documents

MSC:
62J05 Linear regression; mixed models
62J07 Ridge regression; shrinkage estimators (Lasso)

Keywords: convex minimization; estimation after model selection; iterative algorithm; linear regression; oracle inequality; penalized least squares; scale invariance; variance estimation

Cite: \textit{T. Sun} and \textit{C.-H. Zhang}, Biometrika 99, No. 4, 879--898 (2012; Zbl 1452.62515)
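
The penalized joint loss function mentioned in the summary takes, in the standard scaled lasso formulation, the form

\[ L_\lambda(\beta, \sigma) = \frac{\|y - X\beta\|_2^2}{2n\sigma} + \frac{\sigma}{2} + \lambda \|\beta\|_1, \]

which is jointly convex in $(\beta, \sigma)$: for fixed $\sigma$, the minimizer in $\beta$ is a lasso fit at penalty level $\sigma\lambda$, and for fixed $\beta$, the minimizer in $\sigma$ is the root mean residual square $\|y - X\beta\|_2 / \sqrt{n}$. The following is a minimal sketch of the resulting alternating iteration, using scikit-learn's Lasso for the inner step; the function name, the initialization of sigma, and the suggested choice of lam0 are illustrative assumptions, not the paper's prescriptions.

import numpy as np
from sklearn.linear_model import Lasso

def scaled_lasso(X, y, lam0, tol=1e-6, max_iter=100):
    """Sketch of the scaled lasso iteration: alternate a lasso fit
    at penalty sigma * lam0 with a noise-level update from the mean
    residual square. lam0 is the base penalty level; a common
    illustrative choice is sqrt(2 * log(p) / n)."""
    n, p = X.shape
    sigma = np.std(y)  # illustrative initial noise-level guess
    beta = np.zeros(p)
    for _ in range(max_iter):
        # Lasso step: penalty scaled in proportion to current sigma.
        # sklearn's Lasso minimizes ||y - Xb||^2 / (2n) + alpha * ||b||_1.
        model = Lasso(alpha=sigma * lam0, fit_intercept=False)
        model.fit(X, y)
        beta = model.coef_
        # Scale step: noise level set to the root mean residual square.
        sigma_new = np.sqrt(np.mean((y - X @ beta) ** 2))
        if abs(sigma_new - sigma) < tol * sigma:
            sigma = sigma_new
            break
        sigma = sigma_new
    return beta, sigma

Because each step minimizes the joint loss in one block of variables with the other held fixed, the iteration decreases $L_\lambda$ monotonically, consistent with the summary's remark that the procedure costs little beyond computing a penalty path for the underlying sparse regression estimator.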