SCAD-penalized regression in high-dimensional partially linear models. (English) Zbl 1162.62037
Summary: We consider the problem of simultaneous variable selection and estimation in partially linear models with a divergent number of covariates in the linear part, under the assumption that the vector of regression coefficients is sparse. We apply the smoothly clipped absolute deviation (SCAD) penalty to achieve sparsity in the linear part and use polynomial splines to estimate the nonparametric component. Under reasonable conditions, it is shown that consistency in terms of variable selection and estimation can be achieved simultaneously for the linear and nonparametric components. Furthermore, the SCAD-penalized estimators of the nonzero coefficients are shown to have the asymptotic oracle property, in the sense that they are asymptotically normal with the same means and covariances they would have if the zero coefficients were known in advance. The finite-sample behavior of the SCAD-penalized estimators is evaluated with simulations and illustrated with a data set.
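For concreteness, the SCAD penalty named in the summary (introduced by Fan and Li in 2001; the definition below is theirs, not restated in the paper's abstract) is defined by \(p_\lambda(0) = 0\) and the derivative

\[
p_\lambda'(\theta) = \lambda \left\{ I(\theta \le \lambda) + \frac{(a\lambda - \theta)_+}{(a-1)\lambda}\, I(\theta > \lambda) \right\}, \qquad \theta > 0,\ a > 2.
\]

It coincides with the lasso penalty \(\lambda\theta\) near the origin, bends quadratically for \(\lambda < \theta \le a\lambda\), and is constant for \(\theta > a\lambda\); the flat tail avoids shrinking large coefficients, which is what underlies the oracle property. Fan and Li suggest \(a = 3.7\).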
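As a rough sketch of how such an estimator can be computed (an illustration, not the authors' algorithm or code): one can profile the spline part out of the least-squares criterion and run coordinate descent with the closed-form univariate SCAD update. The truncated-power spline basis, the helper names, and the tuning values `lam`, `n_knots`, and `n_iter` below are all hypothetical choices made for the sketch.

```python
import numpy as np

def scad_threshold(z, lam, a=3.7):
    """Closed-form univariate SCAD minimizer (Fan and Li, 2001) for a
    coordinate with unit quadratic coefficient."""
    az = abs(z)
    if az <= 2 * lam:                       # soft-thresholding region
        return np.sign(z) * max(az - lam, 0.0)
    if az <= a * lam:                       # quadratic transition region
        return ((a - 1) * z - np.sign(z) * a * lam) / (a - 2)
    return z                                # no shrinkage for large signals

def spline_basis(z, n_knots=5):
    """Cubic truncated-power basis for the nonparametric component g(z);
    a stand-in for the polynomial splines used in the paper."""
    knots = np.quantile(z, np.linspace(0, 1, n_knots + 2)[1:-1])
    cols = [np.ones_like(z), z, z**2, z**3]
    cols += [np.clip(z - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

def scad_plm(y, X, z, lam, a=3.7, n_iter=200):
    """SCAD-penalized fit of y = X beta + g(z) + eps (illustrative)."""
    n, p = X.shape
    B = spline_basis(z)
    # Profile out the unpenalized spline part: project y and X onto the
    # orthogonal complement of the spline space.
    H = B @ np.linalg.solve(B.T @ B, B.T)
    y_t, X_t = y - H @ y, X - H @ X
    scale = np.sqrt((X_t**2).mean(axis=0))
    X_t = X_t / scale                       # columns with mean-square one
    beta = np.zeros(p)
    r = y_t.copy()                          # current residual y_t - X_t beta
    for _ in range(n_iter):                 # coordinate descent
        for j in range(p):
            zj = X_t[:, j] @ r / n + beta[j]
            bj = scad_threshold(zj, lam, a)
            if bj != beta[j]:
                r += X_t[:, j] * (beta[j] - bj)
                beta[j] = bj
    beta = beta / scale                     # back to the original scale
    # Spline coefficients given the selected linear part.
    theta = np.linalg.solve(B.T @ B, B.T @ (y - X @ beta))
    return beta, theta

# Tiny demo with sparse truth beta = (1.5, -2, 0, ..., 0), g(z) = sin(2 pi z).
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
zc = rng.uniform(size=n)
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + np.sin(2 * np.pi * zc) \
    + 0.5 * rng.standard_normal(n)
beta_hat, theta_hat = scad_plm(y, X, zc, lam=0.2)
```

In practice the penalty level \(\lambda\) would be chosen by cross-validation or a BIC-type criterion, and since the SCAD objective is nonconvex, coordinate descent only guarantees a local minimum; warm-starting from a lasso fit is a common safeguard.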

MSC:
62G08 Nonparametric regression and quantile regression
62J05 Linear regression; mixed models
62G20 Asymptotic properties of nonparametric inference
62E20 Asymptotic distribution theory in statistics