# zbMATH — the first resource for mathematics

A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression. (English) Zbl 1461.90091
Summary: In this paper, we propose a new optimization algorithm for sparse logistic regression based on a stochastic version of the Douglas-Rachford splitting method. Our algorithm performs both function and variable splitting. It sweeps the training set by randomly selecting a mini-batch of data at each iteration, and it allows the variables to be updated in a block-coordinate manner. Our approach leverages the proximity operator of the logistic loss, which can be expressed in terms of the generalized Lambert W function. Experiments carried out on standard datasets demonstrate the efficiency of our approach with respect to stochastic gradient-like methods.
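As context for the summary's mention of the proximity operator of the logistic loss: this operator has no elementary closed form (the paper expresses it via the generalized Lambert W function), but it can be evaluated numerically. The following sketch, not taken from the paper, computes it with Newton's method on the scalar optimality condition; the function name `prox_logistic` is a hypothetical helper chosen here for illustration.

```python
import math

def sigmoid(x):
    # Numerically safe logistic sigmoid.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

def prox_logistic(y, gamma, tol=1e-12, max_iter=100):
    """Proximity operator of gamma * log(1 + exp(.)) evaluated at y.

    Solves the first-order optimality condition
        x - y + gamma * sigmoid(x) = 0
    by Newton's method; the closed-form solution of this equation
    involves the generalized Lambert W function.
    """
    x = y
    for _ in range(max_iter):
        s = sigmoid(x)
        g = x - y + gamma * s            # derivative of the prox objective
        h = 1.0 + gamma * s * (1.0 - s)  # second derivative, always >= 1
        step = g / h
        x -= step
        if abs(step) < tol:
            break
    return x
```

The objective is strongly convex with curvature at least 1, so the Newton iteration is well conditioned; the result always lies between `y - gamma` and `y`.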

##### MSC:
 90C25 Convex programming
 90C90 Applications of mathematical programming
##### Software:
UNLocBoX
##### References:
 [1] Rosasco, L.; Vito, E.; Caponnetto, A.; Piana, M.; Verri, A., Are loss functions all the same?, Neural Comput., 16, 1063-1076, (2004) · Zbl 1089.68109
 [2] Bartlett, PL; Jordan, MI; McAuliffe, JD, Convexity, classification, and risk bounds, J. Am. Stat. Assoc., 101, 138-156, (2006) · Zbl 1118.62330
 [3] Bradley, P.S., Mangasarian, O.L.: Feature selection via concave minimization and support vector machines. In: Proceedings of ICML, Madison, USA, pp. 82-90 (1998)
 [4] Weston, J.; Elisseeff, A.; Schölkopf, B.; Tipping, M., Use of the zero-norm with linear models and kernel methods, Mach. Learn., 3, 1439-1461, (2002) · Zbl 1102.68605
 [5] Liu, Y.; Helen Zhang, H.; Park, C.; Ahn, J., Support vector machines with adaptive Lq penalty, Comput. Stat. Data Anal., 51, 6380-6394, (2007) · Zbl 1446.62179
 [6] Zou, H.; Yuan, M., The $$F_\infty$$-norm support vector machine, Stat. Sin., 18, 379-398, (2008) · Zbl 1416.62370
 [7] Laporte, L.; Flamary, R.; Canu, S.; Déjean, S.; Mothe, J., Non-convex regularizations for feature selection in ranking with sparse SVM, IEEE Trans. Neural Netw. Learn. Syst., 25, 1118-1130, (2014)
 [8] Krishnapuram, B.; Carin, L.; Figueiredo, MAT; Hartemink, AJ, Sparse multinomial logistic regression: fast algorithms and generalization bounds, IEEE Trans. Pattern Anal. Mach. Intell., 27, 957-968, (2005)
 [9] Meier, L.; Geer, S.; Bühlmann, P., The group lasso for logistic regression, J. R. Stat. Soc. B, 70, 53-71, (2008) · Zbl 1400.62276
 [10] Duchi, J., Singer, Y.: Boosting with structural sparsity. In: Proceedings of ICML, Montreal, Canada, pp. 297-304 (2009)
 [11] Obozinski, G.; Taskar, B.; Jordan, MI, Joint covariate selection and joint subspace selection for multiple classification problems, Stat. Comput., 20, 231-252, (2010)
 [12] Yuan, G-X; Chang, K-W; Hsieh, C-J; Lin, C-J, A comparison of optimization methods and software for large-scale L1-regularized linear classification, Mach. Learn., 11, 3183-3234, (2010) · Zbl 1242.62065
 [13] Rosasco, L.; Villa, S.; Mosci, S.; Santoro, M.; Verri, A., Nonparametric sparsity and regularization, J. Mach. Learn. Res., 14, 1665-1714, (2013) · Zbl 1317.68183
 [14] Blondel, M., Fujino, A., Ueda, N.: Large-scale multiclass support vector machine training via Euclidean projection onto the simplex. In: Proceedings of ICPR, Stockholm, Sweden, 24-28 August 2014, pp. 1289-1294
 [15] Wang, L.; Shen, X., On $$l_1$$-norm multi-class support vector machines: methodology and theory, J. Am. Stat. Assoc., 102, 583-594, (2007) · Zbl 1172.62317
 [16] Mairal, J.: Optimization with first-order surrogate functions. In: Proceedings of ICML, pp. 783-791 (2013)
 [17] Chierchia, G., Pustelnik, N., Pesquet, J.-C., Pesquet-Popescu, B.: A proximal approach for sparse multiclass SVM (2015). Preprint arXiv:1501.03669 · Zbl 1374.94069
 [18] Barlaud, M.; Belhajali, W.; Combettes, PL; Fillatre, L., Classification and regression using a constrained convex splitting method, IEEE Trans. Signal Process., 65, 4635-4644, (2017) · Zbl 1414.94052
 [19] Iutzeler, F.; Malick, J., On the proximal gradient algorithm with alternated inertia, J. Optim. Theory Appl., 176, 688-710, (2018) · Zbl 06872632
 [20] Tan, M., Wang, L., Tsang, I.W.: Learning sparse SVM for feature selection on very high dimensional datasets. In: Proceedings of ICML, Haifa, Israel, 21-24 June 2010, pp. 1047-1054
 [21] Hsieh, C.J., Chang, K.W., Lin, C.J., Keerthi, S.S., Sundararajan, S.: A dual coordinate descent method for large-scale linear SVM. In: Proceedings of ICML, pp. 408-415 (2008)
 [22] Lacoste-Julien, S.; Jaggi, M.; Schmidt, M.; Pletscher, P., Block-coordinate Frank-Wolfe optimization for structural SVMs, Proc. ICML, 28, 53-61, (2013)
 [23] Pereyra, M.; Schniter, P.; Chouzenoux, E.; Pesquet, J-C; Tourneret, J-Y; Hero, A.; McLaughlin, S., A survey of stochastic simulation and optimization methods in signal processing, IEEE J. Sel. Top. Signal Process., 10, 224-241, (2016)
 [24] Shalev-Shwartz, S.; Tewari, A., Stochastic methods for l1-regularized loss minimization, J. Mach. Learn. Res., 12, 1865-1892, (2011) · Zbl 1280.62081
 [25] Blondel, M.; Seki, K.; Uehara, K., Block coordinate descent algorithms for large-scale sparse multiclass classification, Mach. Learn., 93, 31-52, (2013) · Zbl 1293.68216
 [26] Fercoq, O.; Richtárik, P., Accelerated, parallel, and proximal coordinate descent, SIAM J. Optim., 25, 1997-2023, (2015) · Zbl 1327.65108
 [27] Lu, Z.; Xiao, L., On the complexity analysis of randomized block-coordinate descent methods, Math. Program., 152, 615-642, (2015) · Zbl 1321.65100
 [28] Langford, J.; Li, L.; Zhang, T., Sparse online learning via truncated gradient, J. Mach. Learn. Res., 10, 777-801, (2009) · Zbl 1235.68167
 [29] Xiao, L., Dual averaging methods for regularized stochastic learning and online optimization, J. Mach. Learn. Res., 11, 2543-2596, (2010) · Zbl 1242.62011
 [30] Richtárik, P.; Takáč, M., Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function, Math. Program., 144, 1-38, (2014) · Zbl 1301.65051
 [31] Rosasco, L.; Villa, S.; Vũ, BC, Stochastic forward-backward splitting for monotone inclusions, J. Optim. Theory Appl., 169, 388-406, (2016) · Zbl 06586792
 [32] Combettes, P.; Pesquet, J-C, Stochastic approximations and perturbations in forward-backward splitting for monotone operators, Pure Appl. Funct. Anal., 1, 13-37, (2016) · Zbl 1377.90109
 [33] Combettes, L., Pesquet, J.-C.: Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping II: mean-square and linear convergence. Math. Program. (2018)
 [34] Combettes, PL; Pesquet, J-C, Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping, SIAM J. Optim., 25, 1221-1248, (2015) · Zbl 1317.65135
 [35] Pesquet, J-C; Repetti, A., A class of randomized primal-dual algorithms for distributed optimization, J. Nonlinear Convex Anal., 16, 2453-2490, (2015) · Zbl 1336.65113
 [36] Mairal, J.: Stochastic majorization-minimization algorithms for large-scale optimization. In: Proceedings of NIPS, pp. 2283-2291 (2013)
 [37] Chouzenoux, E.; Pesquet, J-C, A stochastic majorize-minimize subspace algorithm for online penalized least squares estimation, IEEE Trans. Signal Process., 65, 4770-4783, (2017) · Zbl 1414.94135
 [38] Briceño-Arias, LM; Combettes, PL, A monotone + skew splitting model for composite monotone inclusions in duality, SIAM J. Optim., 21, 1230-1250, (2011) · Zbl 1239.47053
 [39] Boţ, RI; Hendrich, C., A Douglas-Rachford type primal-dual method for solving inclusions with mixtures of composite and parallel-sum type monotone operators, SIAM J. Optim., 23, 2541-2565, (2013) · Zbl 1295.47066
 [40] Perekrestenko, D., Cevher, V., Jaggi, M.: Faster coordinate descent via adaptive importance sampling. In: Proceedings of AISTATS, Fort Lauderdale, Florida, USA, 20-22 April (2017)
 [41] Mező, I.; Baricz, Á, On the generalization of the Lambert W function, Trans. Am. Math. Soc., 369, 7917-7934, (2017) · Zbl 1375.33034
 [42] Maignan, A.; Scott, T., Fleshing out the generalized Lambert W function, ACM Commun. Comput. Algebra, 50, 45-60, (2016) · Zbl 1367.33017
 [43] Chierchia, G., Pustelnik, N., Pesquet, J.-C.: Random primal-dual proximal iterations for sparse multiclass SVM. In: Proceedings of MLSP, Salerno, Italy (2016)
 [44] Atchadé, YF; Fort, G.; Moulines, E., On perturbed proximal gradient algorithms, J. Mach. Learn. Res., 18, 310-342, (2017) · Zbl 1433.90199
 [45] Combettes, P., Glaudin, L.: Proximal activation of smooth functions in splitting algorithms for convex minimization. Tech. Rep. (2018). arXiv:1803.02919
 [46] Bauschke, H.H., Combettes, P.L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces, 2nd edn. Springer, New York (2017) · Zbl 1359.26003
 [47] Eckstein, J., A simplified form of block-iterative operator splitting and an asynchronous algorithm resembling the multi-block alternating direction method of multipliers, J. Optim. Theory Appl., 173, 155-182, (2017) · Zbl 06768800
 [48] Johnstone, P.R., Eckstein, J.: Projective splitting with forward steps: asynchronous and block-iterative operator splitting. Tech. Rep. (2018). https://arxiv.org/pdf/1803.07043.pdf
 [49] Combettes, PL; Eckstein, J., Asynchronous block-iterative primal-dual decomposition methods for monotone inclusions, Math. Program. Ser. B, 168, 645-672, (2018) · Zbl 06857174
 [50] Chierchia, G., Cherni, A., Chouzenoux, E., Pesquet, J.-C.: Random block-coordinate Douglas-Rachford approach applied to sparse logistic regression [Approche de Douglas-Rachford aléatoire par blocs appliquée à la régression logistique parcimonieuse]. In: Actes du GRETSI, Juan-les-Pins, France (2017)
 [51] Cortes, C.; Vapnik, V., Support-vector networks, Mach. Learn., 20, 273-297, (1995) · Zbl 0831.68098
 [52] Martins, A., Astudillo, R.: From softmax to sparsemax: a sparse model of attention and multi-label classification. In: Proceedings of ICML, New York, USA, pp. 1614-1623 (2016)
 [53] Bach, F.; Jenatton, R.; Mairal, J.; Obozinski, G., Optimization with sparsity-inducing penalties, Found. Trends Mach. Learn., 4, 1-106, (2012) · Zbl 06064248
 [54] Chierchia, G., Chouzenoux, E., Combettes, P.L., Pesquet, J.-C.: The proximity operator repository (user’s guide). http://proximity-operator.net/ (2017). Accessed 10 Jan 2019
 [55] Combettes, PL; Pesquet, J-C; Bauschke, HH (ed.); Burachik, RS (ed.); Combettes, PL (ed.); Elser, V. (ed.); Luke, DR (ed.); Wolkowicz, H. (ed.), Proximal splitting methods in signal processing, 185-212, (2011), New York
 [56] Parikh, N.; Boyd, S., Proximal algorithms, Found. Trends Optim., 1, 123-231, (2014)
 [57] Wang, P.-W., Wytock, M., Kolter, J.Z.: Epigraph projections for fast general convex programming. In: Proceedings of ICML (2016)
 [58] Rifkin, R.; Klautau, A., In defense of one-vs-all classification, Mach. Learn., 5, 101-141, (2004) · Zbl 1222.68287
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.