
A novel convex clustering method for high-dimensional data using semiproximal ADMM. (English) Zbl 1459.62108

Summary: Clustering is an important ingredient of unsupervised learning; classical clustering methods include K-means clustering and hierarchical clustering. These methods can be unstable because they tend to get trapped in local optima of a nonconvex objective. In this paper, we propose a new convex clustering method for high-dimensional data based on the sparse group lasso penalty, which simultaneously groups observations and eliminates noninformative features. In this method, the number of clusters is learned from the data rather than specified in advance as a parameter. We prove that the proposed method enjoys desirable statistical properties, including a finite-sample error bound and feature-screening consistency. Furthermore, a semiproximal alternating direction method of multipliers (ADMM) is designed to solve the sparse group lasso convex clustering model, and its convergence analysis is established without any additional conditions. Finally, the effectiveness of the proposed method is demonstrated through simulated experiments and real applications.
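To illustrate the kind of model the summary describes, the following is a simplified sketch of convex clustering solved by a standard (not semiproximal) ADMM. It minimizes the plain sum-of-norms objective (1/2)‖X − A‖² + γ Σ_{i<j} ‖A_i − A_j‖₂ with uniform fusion weights, and omits the paper's sparse group lasso feature-selection penalty entirely; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def convex_cluster_admm(X, gamma, rho=1.0, n_iter=200):
    """ADMM sketch for min_A 0.5*||X - A||_F^2 + gamma * sum_{i<j} ||A_i - A_j||_2.

    Each row of A is the centroid assigned to the corresponding row of X;
    rows whose centroids coincide belong to the same cluster, so the number
    of clusters emerges from the solution rather than being fixed a priori.
    """
    n, p = X.shape
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
    m = len(edges)
    # Incidence matrix D: (D @ A)[l] = A[i] - A[j] for edge l = (i, j).
    D = np.zeros((m, n))
    for l, (i, j) in enumerate(edges):
        D[l, i], D[l, j] = 1.0, -1.0
    V = np.zeros((m, p))          # split variable for the pairwise differences
    U = np.zeros((m, p))          # scaled dual variable
    M_inv = np.linalg.inv(np.eye(n) + rho * D.T @ D)   # fixed linear system
    for _ in range(n_iter):
        A = M_inv @ (X + rho * D.T @ (V - U))          # centroid update
        Z = D @ A + U
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        # Group soft-threshold: prox of (gamma/rho) * ||.||_2 applied rowwise.
        V = np.maximum(0.0, 1.0 - gamma / (rho * np.maximum(norms, 1e-12))) * Z
        U += D @ A - V                                 # dual ascent step
    return A
```

With γ = 0 the centroids recover the data, and as γ grows the centroids fuse, collapsing to a single point for γ large enough; tracing this path over a grid of γ values yields the clustering hierarchy. The paper's method additionally penalizes feature groups to screen out noninformative dimensions, which this sketch does not attempt.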

MSC:

62H30 Classification and discrimination; cluster analysis (statistical aspects)
62J07 Ridge regression; shrinkage estimators (Lasso)
62G08 Nonparametric regression and quantile regression
