zbMATH — the first resource for mathematics

The benefit of group sparsity. (English) Zbl 1202.62052
Summary: This paper develops a theory for group Lasso using a concept called strong group sparsity. Our result shows that group Lasso is superior to standard Lasso for strongly group-sparse signals. This provides a convincing theoretical justification for using group sparse regularization when the underlying group structure is consistent with the data. Moreover, the theory predicts some limitations of the group Lasso formulation that are confirmed by simulation studies.
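To make the object of study concrete, here is a minimal sketch (not from the paper) of the group Lasso estimator the summary refers to: minimize 0.5·‖y − Xb‖² + λ·Σ_g ‖b_g‖₂ over a fixed partition of the coefficients into groups. The solver below uses plain proximal gradient descent with block soft-thresholding; the function names, step-size rule, and iteration count are illustrative choices, not anything prescribed by the paper.

```python
import numpy as np

def group_soft_threshold(v, t):
    """Block soft-thresholding: the proximal operator of t * ||v||_2
    for one coefficient group. Shrinks the whole group toward zero and
    zeroes it out entirely when its Euclidean norm is at most t."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def group_lasso(X, y, groups, lam, step=None, n_iter=500):
    """Proximal gradient descent for the (unweighted) group Lasso
        min_b 0.5 * ||y - X b||^2 + lam * sum_g ||b_g||_2,
    where `groups` is a list of index arrays partitioning 0..p-1."""
    n, p = X.shape
    if step is None:
        # reciprocal of the Lipschitz constant of the smooth part's gradient
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)          # gradient of the least-squares term
        z = b - step * grad               # forward (gradient) step
        for g in groups:                  # backward (proximal) step, group-wise
            b[g] = group_soft_threshold(z[g], step * lam)
    return b
```

On a strongly group-sparse signal (a few groups entirely active, the rest entirely zero), the block penalty tends to zero out the inactive groups as whole units, which is the regime in which the summary says group Lasso provably outperforms the standard Lasso.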

MSC:
62G08 Nonparametric regression and quantile regression
62G05 Nonparametric estimation
62J05 Linear regression; mixed models
65C60 Computational problems in statistics (MSC2010)
References:
[1] Bach, F. R. (2008). Consistency of the group lasso and multiple kernel learning. J. Mach. Learn. Res. 9 1179-1225. · Zbl 1225.68147 · www.jmlr.org
[2] Bickel, P., Ritov, Y. and Tsybakov, A. (2009). Simultaneous analysis of Lasso and Dantzig selector. Ann. Statist. 37 1705-1732. · Zbl 1173.62022 · doi:10.1214/08-AOS620
[3] Candès, E. J. and Tao, T. (2005). Decoding by linear programming. IEEE Trans. Inform. Theory 51 4203-4215. · Zbl 1264.94121 · doi:10.1109/TIT.2005.858979
[4] Ji, S., Dunson, D. and Carin, L. (2009). Multi-task compressive sensing. IEEE Trans. Signal Process. 57 92-106. · Zbl 1391.94258
[5] Koltchinskii, V. and Yuan, M. (2008). Sparse recovery in large ensembles of kernel machines. In COLT’08 . Omnipress, Madison, WI.
[6] Lounici, K., Pontil, M., Tsybakov, A. B. and van de Geer, S. A. (2009). Taking advantage of sparsity in multi-task learning. In COLT’09 . Omnipress, Madison, WI. · Zbl 1177.62001
[7] Nardi, Y. and Rinaldo, A. (2008). On the asymptotic properties of the group lasso estimator for linear models. Electron. J. Stat. 2 605-633. · Zbl 1320.62167 · doi:10.1214/08-EJS200
[8] Obozinski, G., Wainwright, M. J. and Jordan, M. I. (2008). Union support recovery in high-dimensional multivariate regression. Technical Report 761, Dept. of Statistics, Univ. California, Berkeley, CA. · Zbl 1373.62372
[9] Pisier, G. (1989). The volume of convex bodies and Banach space geometry. Cambridge Univ. Press, Cambridge. · Zbl 0698.46008
[10] Rauhut, H., Schnass, K. and Vandergheynst, P. (2008). Compressed sensing and redundant dictionaries. IEEE Trans. Inform. Theory 54 2210-2219. · Zbl 1332.94022 · doi:10.1109/TIT.2008.920190
[11] Stojnic, M., Parvaresh, F. and Hassibi, B. (2009). On the reconstruction of block-sparse signals with an optimal number of measurements. IEEE Trans. Signal Process. 57 3075-3085. · Zbl 1391.94402
[12] Wipf, D. and Rao, B. (2007). An empirical Bayesian strategy for solving the simultaneous sparse approximation problem. IEEE Trans. Signal Process. 55 3704-3716. · Zbl 1391.62010 · doi:10.1109/TSP.2007.894265
[13] Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. Ser. B Stat. Methodol. 68 49-67. · Zbl 1141.62030 · doi:10.1111/j.1467-9868.2005.00532.x
[14] Zhang, C.-H. and Huang, J. (2008). The sparsity and bias of the lasso selection in high-dimensional linear regression. Ann. Statist. 36 1567-1594. · Zbl 1142.62044 · doi:10.1214/07-AOS520
[15] Zhang, T. (2006). Information theoretical upper and lower bounds for statistical estimation. IEEE Trans. Inform. Theory 52 1307-1321. · Zbl 1320.94033 · doi:10.1109/TIT.2005.864439
[16] Zhang, T. (2009). Some sharp performance bounds for least squares regression with ℓ1 regularization. Ann. Statist. 37 2109-2144. · Zbl 1173.62029 · doi:10.1214/08-AOS659