A new semi-supervised classifier based on maximum vector-angular margin. (English) Zbl 1364.65122

Summary: Semi-supervised learning is an attractive approach to classification problems when insufficient training information is available. In this investigation, a new semi-supervised classifier, called S\(^3\)MAMC, is proposed based on the concept of maximum vector-angular margin; its main goal is to find an optimal vector \(c\) as close as possible to the center of the dataset, which consists of both labeled and unlabeled samples. This gives S\(^3\)MAMC better generalization with a smaller VC (Vapnik-Chervonenkis) dimension. However, the S\(^3\)MAMC formulation is a non-convex model and is therefore difficult to solve. We then present two optimization algorithms, a mixed integer quadratic program (MIQP) and a DC (difference of convex functions) program algorithm, to solve the S\(^3\)MAMC. Numerical experiments on real and synthetic databases demonstrate that, compared with supervised learning methods, the S\(^3\)MAMC can improve generalization when the labeled samples are relatively few. In addition, the S\(^3\)MAMC achieves competitive generalization results compared with traditional semi-supervised classification methods.
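The DC program algorithm mentioned above (DCA) iterates by splitting the objective as \(f = g - h\) with \(g, h\) convex, linearizing \(h\) at the current iterate, and minimizing the resulting convex surrogate. The following is a minimal one-dimensional sketch of that scheme; the toy function and its DC decomposition are illustrative assumptions, not the S\(^3\)MAMC model itself:

```python
import numpy as np

def dca_minimize(grad_g_inv, grad_h, x0, tol=1e-10, max_iter=100):
    """Generic DCA loop for f = g - h (both convex, 1-D sketch):
    at each step, linearize h at x_k and minimize g(x) - grad_h(x_k) * x,
    i.e. solve the stationarity condition grad_g(x) = grad_h(x_k)."""
    x = x0
    for _ in range(max_iter):
        x_new = grad_g_inv(grad_h(x))  # convex subproblem solved in closed form
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

# Toy DC objective f(x) = x**4 - 4*x**2, decomposed as g(x) = x**4, h(x) = 4*x**2.
# grad_g(x) = 4*x**3, so grad_g_inv(y) = cbrt(y / 4); grad_h(x) = 8*x.
x_star = dca_minimize(lambda y: np.cbrt(y / 4.0), lambda x: 8.0 * x, x0=1.0)
# Starting from x0 = 1.0, the iterates converge to sqrt(2),
# a stationary point of f (here also a global minimizer).
```

Each DCA step only requires solving a convex problem, which is why the approach scales to non-convex models such as S\(^3\)MAMC where the convex subproblems are tractable quadratic programs.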


65K05 Numerical mathematical programming methods
90C26 Nonconvex programming, global optimization
90C11 Mixed integer programming
78M50 Optimization problems in optics and electromagnetic theory


Full Text: DOI


[1] L. T. H. An, The DC (difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems, Annals of Operations Research, 133, 23, (2005) · Zbl 1116.90122
[2] L. T. H. An, A DC programming approach for feature selection in support vector machines learning, Advances in Data Analysis and Classification, 2, 259, (2008) · Zbl 1284.90057
[3] A. Asuncion, UCI machine learning repository, School of Information and Computer Sciences, (2007)
[4] K. Bennett, Semi-supervised support vector machines, in Advances in Neural Information Processing Systems, 12, 368, (1998)
[5] W. Changzhi, A DC programming approach for sensor network localization with uncertainties in anchor positions, Journal of Industrial and Management Optimization, 10, 817, (2014) · Zbl 1292.90314
[6] O. Chapelle, Optimization techniques for semi-supervised support vector machines, Journal of Machine Learning Research, 9, 203, (2008) · Zbl 1225.68158
[7] T. Fawcett, An introduction to ROC analysis, Pattern Recognition Letters, 27, 861, (2006)
[8] G. Fung, Semi-supervised support vector machines for unlabeled data classification, Optimization Methods & Software, 15, 29, (2001) · Zbl 1075.68633
[9] W. Guan, Sparse high-dimensional fractional-norm support vector machine via DC programming, Computational Statistics and Data Analysis, 67, 136, (2013) · Zbl 1471.62080
[10] W. J. Hu, The maximum vector-angular margin classifier and its fast training on large datasets using a core vector machine, Neural Networks, 27, 60, (2012) · Zbl 1250.68225
[11] P. D. Tao, Convex analysis approaches to DC programming: Theory, algorithms and applications, Acta Mathematica, 22, 287, (1997)
[12] B. Scholkopf, New support vector algorithms, Neural Computation, 12, 1207, (2000)
[13] X. Xiao, A sequential convex program method to DC program with joint chance constraints, Journal of Industrial and Management Optimization, 8, 733, (2012) · Zbl 1288.65089
[14] L. M. Yang, A class of smooth semi-supervised SVM by difference of convex functions programming and algorithm, Knowledge-Based Systems, 41, 1, (2013)
[15] YALMIP Toolbox, http://control.ee.ethz.ch/~joloef/wiki/pmwiki.php
[16] Y. B. Yuan, Canonical duality solution for alternating support vector machine · Zbl 1291.90214
[17] V. N. Vapnik, Statistical Learning Theory, New York: Wiley, (1998) · Zbl 0935.62007