
Robust kernel principal component analysis and classification. (English) Zbl 1284.62370

Summary: Kernel principal component analysis (KPCA) extends linear PCA from a real vector space to any high-dimensional kernel feature space. The sensitivity of linear PCA to outliers is well known, and various robust alternatives have been proposed in the literature. For KPCA, such robust versions have received considerably less attention. In this article we present kernel versions of three robust PCA algorithms: spherical PCA, projection pursuit and ROBPCA. These robust KPCA algorithms are analyzed in a classification context by applying discriminant analysis to the KPCA scores. The performance of the different robust KPCA algorithms is studied in a simulation study comparing misclassification percentages on both clean and contaminated data. An outlier map is constructed to visualize outliers in such classification problems. A real-life example from protein classification illustrates the usefulness of robust KPCA and its corresponding outlier map.
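As a concrete starting point, the following minimal Python sketch (assuming scikit-learn is available) runs the basic pipeline the summary describes, namely classical (non-robust) KPCA followed by linear discriminant analysis on the KPCA scores. The robust variants studied in the paper (spherical KPCA, kernel projection pursuit, kernel ROBPCA) and the outlier map are not implemented here; the toy data, RBF bandwidth and number of components are illustrative choices, not taken from the paper.

# Minimal sketch: classical KPCA scores + linear discriminant analysis.
# The robust KPCA algorithms of the paper would replace the KernelPCA step.
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Toy two-class data (illustrative only, not the protein data of the paper)
X, y = make_moons(n_samples=300, noise=0.15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# KPCA: eigendecomposition of the centered RBF kernel matrix
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=2.0)
scores_train = kpca.fit_transform(X_train)
scores_test = kpca.transform(X_test)

# Discriminant analysis on the KPCA scores, then misclassification rate
lda = LinearDiscriminantAnalysis().fit(scores_train, y_train)
print("test misclassification:", 1.0 - lda.score(scores_test, y_test))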

MSC:

62H30 Classification and discrimination; cluster analysis (statistical aspects)
62G35 Nonparametric robustness
62H25 Factor analysis and principal components; correspondence analysis

References:

[1] Alzate C, Suykens JAK (2008) Kernel component analysis using an epsilon-insensitive robust loss function. IEEE Trans Neural Netw 19: 1583–1598
[2] Croux C, Ruiz-Gazen A (1996) A fast algorithm for robust principal components based on projection pursuit. In: COMPSTAT: Proceedings in computational statistics, pp 211–216 · Zbl 0900.62300
[3] Croux C, Ruiz-Gazen A (2005) High breakdown estimators for principal components: the projection-pursuit approach revisited. J Multivar Anal 95: 206–226 · Zbl 1065.62040
[4] Croux C, Filzmoser P, Oliveira MR (2007) Algorithms for projection-pursuit robust principal component analysis. Chemom Intell Lab Syst 87: 218–225
[5] Cui H, He X, Ng KW (2003) Asymptotic distributions of principal components based on robust dispersions. Biometrika 90: 953–966 · Zbl 1436.62222
[6] Debruyne M (2009) An outlier map for support vector machine classification. Ann Appl Stat 3(4): 1566–1580 · Zbl 1185.62112
[7] Debruyne M, Hubert M (2009) The influence function of the Stahel-Donoho covariance estimator of smallest outlyingness. Stat Probab Lett 79: 275–282 · Zbl 1169.62049
[8] Debruyne M, Hubert M, Van Horebeek J (2009a) Detecting influential observations in Kernel PCA. Comput Stat Data Anal (in press). doi: 10.1016/j.csda.2009.08.018 · Zbl 1284.62046
[9] Debruyne M, Serneels S, Verdonck T (2009b) Robustified least squares support vector classification. J Chemometrics 23(9): 479–486
[10] Donoho DL, Gasko M (1992) Breakdown properties of location estimates based on half-space depth and projected outlyingness. Ann Stat 20: 1803–1827 · Zbl 0776.62031
[11] Friedman JH, Tukey JW (1974) A projection pursuit algorithm for exploratory data analysis. IEEE Trans Comput C-23(9): 881–890 · Zbl 0284.68079
[12] Huber PJ (1985) Projection pursuit. Ann Stat 13: 435–475 · Zbl 0595.62059
[13] Hubert M, Engelen S (2004) Robust PCA and classification in biosciences. Bioinformatics 20: 1728–1736
[14] Hubert M, Van Driessen K (2004) Fast and robust discriminant analysis. Comput Stat Data Anal 45: 301–320 · Zbl 1429.62247
[15] Hubert M, Rousseeuw PJ, Verboven S (2002) A fast robust method for principal components with applications to chemometrics. Chemom Intell Lab Syst 60: 101–111
[16] Hubert M, Rousseeuw PJ, Vanden Branden K (2005) ROBPCA: a new approach to robust principal components analysis. Technometrics 47: 64–79
[17] Li G, Chen Z (1985) Projection-pursuit approach to robust dispersion matrices and principal components: primary theory and Monte Carlo. J Am Stat Assoc 80: 759–766 · Zbl 0595.62060
[18] Liu Z, Chen D, Bensmail H (2005) Gene expression data classification with kernel principal component analysis. J Biomed Biotechnol 2: 155–169
[19] Locantore N, Marron JS, Simpson DG, Tripoli N, Zhang JT, Cohen KL (1999) Robust principal component analysis for functional data. Test 8: 1–73 · Zbl 0980.62049
[20] Lu C-D, Zhang T-Y, Du X-Z, Li C-P (2004) A robust kernel PCA algorithm. Proc Int Conf Mach Learn Cybernet 5: 3084–3087
[21] Marden JI (1999) Some robust estimates of principal components. Stat Probab Lett 43: 349–359 · Zbl 0939.62055
[22] Maronna RA (2005) Principal components and orthogonal regression based on robust scales. Technometrics 47: 264–273
[23] Maronna RA, Zamar R (2002) Robust estimates of location and dispersion for high-dimensional data sets. Technometrics 44: 307–317
[24] Mika S, Rätsch G, Weston J, Schölkopf B, Müller KR (1999) Fisher discriminant analysis with kernels. In: IEEE international workshop on neural networks for signal processing IX, pp 41–48
[25] Nguyen MH, De la Torre F (2009) Robust kernel principal component analysis. Adv Neural Inf Process Syst 21: 1185–1192
[26] Ohst C (1988) Beste approximierende Kreise und ihre Eigenschaften (Best approximating spheres and their properties). Diplomarbeit in Mathematik, Institut für Statistik und Wirtschaftsmathematik, RWTH Aachen University
[27] Pollack JD, Li Q, Pearl DK (2005) Taxonomic utility of a phylogenetic analysis of phosphoglycerate kinase proteins of Archaea, Bacteria, and Eukaryota: insights by Bayesian analyses. Mol Phylogenet Evol 35: 420–430
[28] Rousseeuw PJ (1984) Least median of squares regression. J Am Stat Assoc 79: 871–880 · Zbl 0547.62046
[29] Rousseeuw PJ, Croux C (1993) Alternatives to the median absolute deviation. J Am Stat Assoc 88: 1273–1283 · Zbl 0792.62025
[30] Rousseeuw PJ, Van Driessen K (1999) Fast algorithm for the minimum covariance determinant estimator. Technometrics 41: 212–223
[31] Saigo H, Vert J, Ueda N, Akutsu T (2004) Protein homology detection using string alignment kernels. Bioinformatics 20: 1682–1689
[32] Schölkopf B, Smola A (2002) Learning with kernels. MIT Press, Cambridge · Zbl 1019.68094
[33] Schölkopf B, Smola A, Müller K-R (1998) Nonlinear component analysis as a kernel eigenvalue problem. Neural Comput 10: 1299–1319
[34] Shawe-Taylor J, Cristianini N (2004) Kernel methods for pattern analysis. Cambridge University Press, Cambridge · Zbl 0994.68074
[35] Stahel WA (1981) Robuste Schätzungen: Infinitesimale Optimalität und Schätzungen von Kovarianzmatrizen (Robust estimation: infinitesimal optimality and estimation of covariance matrices). PhD thesis, ETH Zürich · Zbl 0531.62036
[36] Suykens JAK, Van Gestel T, De Brabanter J, De Moor B, Vandewalle J (2002) Least squares support vector machines. World Scientific, Singapore · Zbl 1017.93004
[37] Takahashi T, Kurita T (2002) Robust de-noising by kernel PCA. In: Proceedings of the international conference on artificial neural networks. Lecture notes in computer science, vol 2415, pp 739–744 · Zbl 1013.68831
[38] Verboven S, Hubert M (2005) LIBRA: a MATLAB library for robust analysis. Chemom Intell Lab Syst 75: 127–136
[39] Yang J, Jin Z, Yang JY, Zhang D, Frangi AF (2004) Essence of kernel Fisher discriminant: KPCA plus LDA. Pattern Recognit 37: 2097–2100 · Zbl 02117452
[40] Yang J, Frangi AF, Yang JY, Zhang D, Jin Z (2005) KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition. IEEE Trans Pattern Anal Mach Intell 27: 230–244 · Zbl 05110640