# zbMATH — the first resource for mathematics

Semi-varying coefficient models with a diverging number of components. (English) Zbl 1216.62060
Summary: Semiparametric models with both nonparametric and parametric components have become increasingly useful in many scientific fields, owing to their appropriate representation of the trade-off between the flexibility and efficiency of statistical models. We focus on semi-varying coefficient models (a.k.a. varying coefficient partially linear models) in a "large $$n$$, diverging $$p$$" situation, where the numbers of parametric and nonparametric components both diverge at appropriate rates; we consider only the case $$p=o(n)$$. Consistency of the estimator based on $$B$$-splines and asymptotic normality of the linear components are established under suitable assumptions. Interestingly (although not surprisingly), our analysis shows that the number of parametric components can diverge at a faster rate than the number of nonparametric components, and that the divergence rate of the number of nonparametric components constrains the allowable divergence rate of the parametric components, a phenomenon which, to our knowledge, has not been established in the existing literature. Finally, the finite sample behavior of the estimator is evaluated in some Monte Carlo studies.
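The estimation strategy described above, expanding the nonparametric coefficient functions in a $$B$$-spline basis and then fitting all components jointly by least squares, can be illustrated with a minimal sketch. This is not the paper's implementation; it is a toy example with a single varying coefficient and a fixed, low-dimensional design, and all variable names are hypothetical:

```python
# Illustrative sketch (not the authors' code): least-squares estimation of
# a semi-varying coefficient model
#     y = x' beta + a(u) * z + eps,
# where the unknown coefficient function a(u) is expanded in a cubic
# B-spline basis and estimated jointly with beta.
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)
n, p = 500, 3                        # sample size, parametric dimension
x = rng.normal(size=(n, p))          # linear (parametric) covariates
u = rng.uniform(size=n)              # index variable of the varying coefficient
z = rng.normal(size=n)               # covariate with a varying coefficient
beta_true = np.array([1.0, -0.5, 2.0])
a = lambda t: np.sin(2 * np.pi * t)  # true coefficient function
y = x @ beta_true + a(u) * z + 0.1 * rng.normal(size=n)

# Clamped cubic B-spline basis on [0, 1] with 6 interior knots.
k = 3
interior = np.linspace(0.0, 1.0, 8)[1:-1]
knots = np.r_[np.zeros(k + 1), interior, np.ones(k + 1)]
n_basis = len(knots) - k - 1
# Evaluating a spline with identity coefficients yields the design matrix
# of all basis functions at the points u (shape: n x n_basis).
B = BSpline(knots, np.eye(n_basis), k)(u)

# Joint least squares over [beta, spline coefficients of a(.)].
design = np.hstack([x, B * z[:, None]])
theta, *_ = np.linalg.lstsq(design, y, rcond=None)
beta_hat = theta[:p]             # estimate of the linear components
a_hat = B @ theta[p:]            # fitted values of a(u) at the sample points
```

In the paper's regime both $$p$$ and the number of spline basis functions grow with $$n$$ at controlled rates; here both are fixed purely for illustration.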

##### MSC:
62G08 Nonparametric regression and quantile regression
62G20 Asymptotic properties of nonparametric inference
65C05 Monte Carlo methods
65D07 Numerical computation using splines
##### Keywords:
$$B$$-spline basis; diverging parameters
##### Software:
hgam
##### References:
[1] Bai, Z.; Silverstein, J.W., ()
[2] De Boor, C., A practical guide to splines, (2001), Springer-Verlag New York · Zbl 0987.65015
[3] Fan, J.Q.; Li, R.Z., Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 456, 1348-1360, (2001) · Zbl 1073.62547
[4] Fan, J.Q.; Peng, H., Nonconcave penalized likelihood with a diverging number of parameters, Annals of Statistics, 32, 3, 928-961, (2004) · Zbl 1092.62031
[5] Green, P.J.; Silverman, B.W., ()
[6] Hastie, T.; Tibshirani, R., ()
[7] Hastie, T.; Tibshirani, R., Varying-coefficient models, Journal of the Royal Statistical Society. Series B. Statistical Methodology, 55, 4, 757-796, (1993) · Zbl 0796.62060
[8] Huang, J.Z., Local asymptotics for polynomial spline regression, Annals of Statistics, 31, 5, 1600-1635, (2003) · Zbl 1042.62035
[9] Huang, J.; Horowitz, J.L.; Ma, S.G., Asymptotic properties of bridge estimators in sparse high-dimensional regression models, Annals of Statistics, 36, 2, 587-613, (2008) · Zbl 1133.62048
[10] Huang, J.; Horowitz, J.L.; Wei, F., Variable selection in nonparametric additive models, Annals of Statistics, 38, 4, 2282-2313, (2010) · Zbl 1202.62051
[11] Huang, J.H.Z.; Wu, C.O.; Zhou, L., Polynomial spline estimation and inference for varying coefficient models with longitudinal data, Statistica Sinica, 14, 3, 763-788, (2004) · Zbl 1073.62036
[12] Huber, P.J., Robust regression: asymptotics, conjectures and Monte Carlo, Annals of Statistics, 1, 5, 799-821, (1973) · Zbl 0289.62033
[13] Li, R.; Liang, H., Variable selection in semiparametric regression modeling, Annals of Statistics, 36, 1, 261-286, (2008) · Zbl 1132.62027
[14] Li, G.; Peng, H.; Zhu, L.X., Nonconcave penalized $$M$$-estimation with diverging number of parameters, Statistica Sinica, 21, 391-420, (2011) · Zbl 1206.62036
[15] Meier, L.; Van de Geer, S.; Buhlmann, P., High-dimensional additive modeling, Annals of Statistics, 37, 6B, 3779-3821, (2009) · Zbl 1360.62186
[16] Portnoy, S., Asymptotic behavior of $$M$$-estimators of $$p$$ regression parameters when $$p^2 / n$$ is large. I. Consistency, Annals of Statistics, 12, 4, 1298-1309, (1984) · Zbl 0584.62050
[17] Portnoy, S., Asymptotic behavior of $$M$$-estimators of $$p$$ regression parameters when $$p^2 / n$$ is large. II. Normal approximation, Annals of Statistics, 13, 4, 1403-1417, (1985) · Zbl 0601.62026
[18] Ravikumar, P.; Liu, H.; Lafferty, J.; Wasserman, L., SpAM: sparse additive models, (), 1201-1208
[19] Stone, C., Additive regression and other nonparametric models, Annals of Statistics, 13, 2, 689-705, (1985) · Zbl 0605.62065
[20] Tibshirani, R., Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society. Series B. Statistical Methodology, 58, 1, 267-288, (1996) · Zbl 0850.62538
[21] Wang, L.F.; Li, H.Z.; Huang, J.H.Z., Variable selection in nonparametric varying-coefficient models for analysis of repeated measurements, Journal of the American Statistical Association, 103, 484, 1556-1569, (2008) · Zbl 1286.62034
[22] Wang, H.S.; Li, B.; Leng, C.L., Shrinkage tuning parameter selection with a diverging number of parameters, Journal of the Royal Statistical Society. Series B. Statistical Methodology, 71, 671-683, (2009) · Zbl 1250.62036
[23] Welsh, A.H., On $$M$$-processes and $$M$$-estimation, Annals of Statistics, 17, 1, 337-361, (1989) · Zbl 0701.62074
[24] Xie, H.L.; Huang, J., SCAD-penalized regression in high-dimensional partially linear models, Annals of Statistics, 37, 2, 673-696, (2009) · Zbl 1162.62037
[25] Yohai, V.J.; Maronna, R.A., Asymptotic behavior of $$M$$-estimators for the linear model, Annals of Statistics, 7, 2, 258-268, (1979) · Zbl 0408.62027
[26] Yuan, M.; Lin, Y., Model selection and estimation in the Gaussian graphical model, Biometrika, 94, 1, 19-35, (2007) · Zbl 1142.62408
[27] Zhang, C.H.; Huang, J., The sparsity and bias of the lasso selection in high-dimensional linear regression, Annals of Statistics, 36, 4, 1567-1594, (2008) · Zbl 1142.62044
[28] Zou, H., The adaptive lasso and its oracle properties, Journal of the American Statistical Association, 101, 476, 1418-1429, (2006) · Zbl 1171.62326
[29] Zou, H.; Li, R.Z., One-step sparse estimates in nonconcave penalized likelihood models, Annals of Statistics, 36, 4, 1509-1533, (2008) · Zbl 1142.62027
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.