
zbMATH — the first resource for mathematics

An iterative SVM approach to feature selection and classification in high-dimensional datasets. (English) Zbl 1323.68459
Summary: The support vector machine (SVM) is a state-of-the-art classification method, and the doubly regularized SVM (DrSVM) is an important extension based on the elastic net penalty. DrSVM has been applied successfully to variable selection, where it retains (or discards) correlated variables as a group. However, solving this model efficiently remains challenging. In this paper we develop an iterative \(\ell _2\)-SVM approach that implements DrSVM on high-dimensional datasets. Our approach significantly reduces the computational complexity, and the corresponding algorithms enjoy a global convergence property. Empirical results on simulated and real-world gene datasets are encouraging.
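The paper's iterative \(\ell _2\)-SVM solver is not reproduced in this record. As a minimal illustration of the DrSVM objective it targets (averaged hinge loss plus an elastic-net penalty, i.e. a weighted sum of \(\ell _1\) and squared \(\ell _2\) terms), the following sketch minimizes that objective by plain subgradient descent on toy data; the function name, step size, and penalty weights are illustrative choices, not the authors' algorithm or parameters.

```python
import numpy as np

def drsvm_subgradient(X, y, lam1=0.01, lam2=0.1, lr=0.05, iters=500):
    """Minimize the DrSVM objective
        (1/n) * sum_i max(0, 1 - y_i (x_i . w + b))
            + lam1 * ||w||_1 + (lam2 / 2) * ||w||_2^2
    by plain subgradient descent (illustrative only; the paper's
    iterative l2-SVM scheme is a far more efficient solver).
    """
    n, p = X.shape
    w = np.zeros(p)
    b = 0.0
    for _ in range(iters):
        margins = y * (X @ w + b)
        active = margins < 1                    # points violating the margin
        # subgradient of the averaged hinge loss
        g_w = -(X[active] * y[active, None]).sum(axis=0) / n
        g_b = -y[active].sum() / n
        # elastic-net penalty: l1 subgradient plus l2 gradient
        g_w += lam1 * np.sign(w) + lam2 * w
        w -= lr * g_w
        b -= lr * g_b
    return w, b

# Toy data: only the first two of ten features carry signal, so the
# l1 part of the penalty should keep their weights dominant.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = np.sign(X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=200))
w, b = drsvm_subgradient(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

The elastic-net structure is what lets DrSVM select correlated variables together: the \(\ell _1\) term drives irrelevant weights toward zero while the \(\ell _2\) term spreads weight across correlated informative features instead of arbitrarily picking one.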

MSC:
68T10 Pattern recognition, speech recognition
68T05 Learning and adaptive systems in artificial intelligence
References:
[1] Rakotomamonjy, A., Variable selection using SVM-based criteria, Journal of Machine Learning Research, 3, 1357-1370, (2003) · Zbl 1102.68583
[2] Y. Grandvalet, S. Canu, Adaptive scaling for feature selection in SVMs, in: NIPS, vol. 15, 2002.
[3] T. Jebara, T. Jaakkola, Feature selection and dualities in maximum entropy discrimination, in: 16th Annual Conference on Uncertainty in Artificial Intelligence, 2000.
[4] Bach, F.; Jenatton, R.; Mairal, J.; Obozinski, G., Optimization with sparsity-inducing penalties, Foundations and Trends in Machine Learning, 4, 1, 1-106, (2012) · Zbl 06064248
[5] P.S. Bradley, O.L. Mangasarian, Feature selection via concave minimization and support vector machines, in: ICML, 1998, pp. 82-90.
[6] Weston, J.; Elisseeff, A.; Schölkopf, B.; Tipping, M., Use of the zero-norm with linear models and kernel methods, Journal of Machine Learning Research, 3, 1439-1461, (2003) · Zbl 1102.68605
[7] J. Zhu, S. Rosset, T. Hastie, R. Tibshirani, 1-norm support vector machines, in: NIPS, vol. 16, 2003.
[8] J. Bi, Y. Chen, J.Z. Wang, A sparse support vector machine approach to region-based image categorization, in: CVPR, 2005, pp. 1121-1128.
[9] Bi, J.; Bennett, K. P.; Embrechts, M.; Breneman, C.; Song, M., Dimensionality reduction via sparse support vector machines, Journal of Machine Learning Research, 3, 1229-1243, (2003) · Zbl 1102.68531
[10] Wang, L.; Zhu, J.; Zou, H., The doubly regularized support vector machine, Statistica Sinica, 16, 2, 589-615, (2006) · Zbl 1126.68070
[11] Zou, H.; Hastie, T., Regularization and variable selection via the elastic net, Journal of the Royal Statistical Society, Series B, 67, 2, 301-320, (2005) · Zbl 1069.62054
[12] Jeyakumar, V.; Li, G.; Suthaharan, S., Support vector machine classifiers with uncertain knowledge sets via robust optimization, Optimization: A Journal of Mathematical Programming and Operations Research, 1-18, (2012)
[13] Wang, L.; Zhu, J.; Zou, H., Hybrid huberized support vector machines for microarray classification and gene selection, Bioinformatics, 24, 3, 412-419, (2008)
[14] G.B. Ye, Y. Chen, X. Xie, Efficient variable selection in support vector machines via the alternating direction methods of multipliers, in: AISTATS, 2011.
[15] Golub, G. H.; Van Loan, C. F., Matrix computations, (1996), Johns Hopkins University Press Baltimore
[16] Boyd, S.; Parikh, N.; Chu, E.; Peleato, B.; Eckstein, J., Distributed optimization and statistical learning via the alternating direction method of multipliers, Foundations and Trends in Machine Learning, 3, 1, 1-122, (2011) · Zbl 1229.90122
[17] Rockafellar, R. T., Convex analysis, (1970), Princeton University Press Princeton · Zbl 0229.90020