An alternating determination-optimization approach for an additive multi-index model. (English) Zbl 1243.62041
Summary: Sufficient dimension reduction techniques are designed to deal with the curse of dimensionality when the underlying model has a very general semiparametric multi-index structure, and to estimate the central subspace spanned by the indices. The cost, however, is that they can identify only the central subspace/central mean subspace and its dimension, not the indices themselves. We investigate estimation for an additive multi-index model (AMM), which has an additive structure in the indices. The problem for the AMM involves determining and estimating the nonparametric component functions and estimating the corresponding indices in the model. Unlike the classical sufficient dimension reduction techniques for subspace estimation and dimensionality determination, we propose a new penalized method that estimates the component functions and the indices simultaneously. To this end, we suggest an alternating determination-optimization algorithm that alternates between fitting the best model and estimating the indices. Estimation consistency is provided. Simulation studies are carried out to examine the performance of the new method, and a real data example is also analysed for illustration.
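The alternating idea can be made concrete with a small sketch. The Python code below is only a toy illustration under assumed choices, not the paper's penalized procedure: the determination step fits the additive component functions by unpenalized cubic B-spline least squares, and the optimization step updates the unit-norm index vectors by a generic Nelder-Mead search; the function names, knot range, and basis size are all hypothetical.

```python
# Hypothetical sketch of alternating "determination" and "optimization"
# steps for an additive multi-index model
#     y = f_1(b_1'x) + ... + f_K(b_K'x) + error.
# NOT the authors' penalized method: components are fitted by plain cubic
# B-spline least squares and indices by a generic Nelder-Mead search.
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def design(u, n_basis=8, lo=-4.0, hi=4.0):
    """Clamped cubic B-spline design matrix on a fixed knot range."""
    inner = np.linspace(lo, hi, n_basis - 2)
    t = np.r_[[lo] * 3, inner, [hi] * 3]   # clamped knot vector
    u = np.clip(u, lo, hi)                 # keep evaluations in range
    return BSpline.design_matrix(u, t, 3).toarray()

def obj(betas_flat, X, y, coef, K, n_basis):
    """RSS as a function of the indices, spline coefficients held fixed."""
    betas = betas_flat.reshape(K, -1)
    betas = betas / np.linalg.norm(betas, axis=1, keepdims=True)
    Z = np.hstack([design(X @ b, n_basis) for b in betas])
    return np.sum((y - Z @ coef) ** 2)

# Toy data: two indices with additive components sin(.) and (.)^2.
n, p, K, n_basis = 400, 5, 2, 8
X = rng.normal(size=(n, p))
b1 = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
b2 = np.array([0.0, 0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
y = np.sin(X @ b1) + (X @ b2) ** 2 + 0.1 * rng.normal(size=n)

betas = rng.normal(size=(K, p))
betas /= np.linalg.norm(betas, axis=1, keepdims=True)

for it in range(10):
    # Determination step: fit all spline components jointly given the indices.
    Z = np.hstack([design(X @ b, n_basis) for b in betas])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    # Optimization step: update the indices with the components held fixed.
    res = minimize(obj, betas.ravel(), args=(X, y, coef, K, n_basis),
                   method="Nelder-Mead", options={"maxiter": 400})
    new = res.x.reshape(K, p)
    new /= np.linalg.norm(new, axis=1, keepdims=True)
    done = np.max(np.abs(new - betas)) < 1e-4   # indices are identified
    betas = new                                 # only up to sign/order
    if done:
        break

print("estimated index vectors (up to sign/order):\n", np.round(betas, 2))
```

In this sketch each outer iteration profiles the spline coefficients out by least squares and then takes a bounded index update, mirroring the alternation between model fitting and index estimation described in the summary; the paper's actual algorithm additionally penalizes the fit to determine the component functions.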

MSC:
62G05 Nonparametric estimation
62H12 Estimation in multivariate analysis
62G08 Nonparametric regression and quantile regression
65C60 Computational problems in statistics (MSC2010)
Software:
sm
References:
[1] Bowman, A.W.; Azzalini, A., Applied smoothing techniques for data analysis: the kernel approach with \(S\)-plus illustrations, (1997), Oxford University Press USA · Zbl 0889.62027
[2] Cook, R.D., Regression graphics: ideas for studying regressions through graphics, (1998), John Wiley New York · Zbl 0903.62001
[3] Cook, R.D.; Weisberg, S., Discussion of ‘sliced inverse regression for dimension reduction’, Journal of the American statistical association, 86, 28-33, (1991)
[4] De Boor, C., A practical guide to splines, (1978), Springer-Verlag · Zbl 0406.41003
[5] Fan, J.; Gijbels, I., Local polynomial modelling and its applications, (1997), Chapman and Hall · Zbl 0873.62037
[6] Fan, J.; Li, R., Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American statistical association, 96, 1348-1360, (2001) · Zbl 1073.62547
[7] Friedman, J.H.; Stuetzle, W., Projection pursuit regression, Journal of the American statistical association, 76, 817-823, (1981)
[8] Härdle, W.; Hall, P.; Ichimura, H., Optimal smoothing in single-index models, Annals of statistics, 21, 157-178, (1993) · Zbl 0770.62049
[9] Hastie, T.; Tibshirani, R., Generalized additive models, (1990), Chapman and Hall · Zbl 0747.62061
[10] Huang, J.Z.; Yang, L., Identification of non-linear additive autoregressive models, Journal of the royal statistical society, series B, 66, 463-477, (2004) · Zbl 1062.62185
[11] Li, K.-C., Sliced inverse regression for dimension reduction, Journal of the American statistical association, 86, 316-327, (1991) · Zbl 0742.62044
[12] Li, B.; Wang, S., On directional regression for dimension reduction, Journal of the American statistical association, 102, 997-1008, (2007) · Zbl 05564427
[13] Li, Y.X.; Zhu, L.X., Asymptotics for sliced average variance estimation, Annals of statistics, 35, 41-69, (2007) · Zbl 1114.62053
[14] Lin, C.J., Projected gradient methods for non-negative matrix factorization, Neural computation, 19, 2756-2779, (2007) · Zbl 1173.90583
[15] Peng, H., Wen, S.Q., Zhu, L.X., 2010. Component selection in an additive regression model. Manuscript.
[16] Ruan, L.; Yuan, M., Dimension reduction and parameter estimation for additive index models, Statistics and its interface, 3, 493-500, (2010) · Zbl 1245.62027
[17] Schumaker, L.L., Spline functions: basic theory, (1981), John Wiley & Sons, Inc. New York · Zbl 0449.41004
[18] Tibshirani, R., Regression shrinkage and selection via the lasso, Journal of the royal statistical society, series B, 58, 267-288, (1996) · Zbl 0850.62538
[19] Wang, H.; Leng, C., A note on adaptive group lasso, Computational statistics and data analysis, 52, 5277-5286, (2008) · Zbl 1452.62524
[20] Wang, H.; Li, R.; Tsai, C.L., Tuning parameter selectors for the smoothly clipped absolute deviation method, Biometrika, 94, 553-568, (2007) · Zbl 1135.62058
[21] Xia, Y.; Tong, H.; Li, W.K.; Zhu, L.X., An adaptive estimation of dimension reduction space, with discussion, Journal of the royal statistical society, series B, 64, 3, 363-410, (2002) · Zbl 1091.62028
[22] Yuan, M., On the identifiability of additive index models, Statistica sinica, 21, 1901-1911, (2011) · Zbl 1225.62059
[23] Yuan, M.; Lin, Y., Model selection and estimation in regression with grouped variables, Journal of the royal statistical society, series B, 68, 1, 49-67, (2006) · Zbl 1141.62030
[24] Zhao, P.; Rocha, G.; Yu, B., Grouped and hierarchical model selection through composite absolute penalties, Annals of statistics, 37, 6A, 3468-3497, (2009) · Zbl 1369.62164
[25] Zhou, N., Zhu, J., 2007. Group variable selection via a hierarchical lasso and its oracle property. Manuscript. · Zbl 1245.62183
[26] Zhu, L.X.; Fang, K.T., Asymptotics for the kernel estimates of sliced inverse regression, Annals of statistics, 24, 1053-1067, (1996)
[27] Zhu, L.X.; Miao, B.Q.; Peng, H., On sliced inverse regression with high dimensional covariates, Journal of the American statistical association, 101, 630-643, (2006) · Zbl 1119.62331
[28] Zhu, L.X.; Ng, K., Asymptotics for sliced inverse regression, Statistica sinica, 5, 727-736, (1995) · Zbl 0824.62036
[29] Zhu, L.P.; Wang, T.; Zhu, L.X.; Ferré, L., Sufficient dimension reduction through discretization – expectation estimation, Biometrika, 97, 295-304, (2010) · Zbl 1205.62048
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.