# zbMATH — the first resource for mathematics

A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints. (English) Zbl 1304.90160
Summary: In this paper we propose a variant of the random coordinate descent method for solving linearly constrained convex optimization problems with composite objective functions. If the smooth part of the objective function has a Lipschitz continuous gradient, then we prove that our method obtains an $$\epsilon$$-optimal solution in $$\mathcal{O}(n^{2}/\epsilon)$$ iterations, where $$n$$ is the number of blocks. For the class of problems with cheap coordinate derivatives we show that the new method is faster than methods based on full-gradient information. An analysis of the rate of convergence in probability is also provided. For strongly convex functions our method converges linearly. Extensive numerical tests confirm that, on very large problems, our method is much more efficient than methods based on full-gradient information.
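The key structural idea summarized above is that a coordinate step must respect the linear coupling constraint, which forces at least two coordinates to move at once. As a minimal sketch of that mechanism (not the authors' algorithm: no composite term, a single constraint $$\sum_i x_i = b$$, and an illustrative quadratic objective), one can pick a random pair $$(i,j)$$ and move along $$e_i - e_j$$, a direction that keeps the constraint satisfied:

```python
import numpy as np

def random_pair_coordinate_descent(Q, q, b, iters=500, seed=0):
    """Minimize 0.5*x^T Q x + q^T x subject to sum(x) = b by random
    two-coordinate updates along d = e_i - e_j; such directions lie in
    the null space of the coupling constraint, so feasibility is exact
    throughout. Illustrative sketch only (smooth quadratic case)."""
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    x = np.full(n, b / n)          # feasible starting point
    g = Q @ x + q                  # gradient, maintained incrementally
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        # Exact line search along e_i - e_j:
        # minimize phi(t) = f(x + t*(e_i - e_j)); phi'(0) = g[i] - g[j],
        # phi''(0) = Q[i,i] - 2*Q[i,j] + Q[j,j].
        denom = Q[i, i] - 2 * Q[i, j] + Q[j, j]
        if denom <= 0:
            continue               # skip degenerate directions
        t = -(g[i] - g[j]) / denom
        x[i] += t
        x[j] -= t
        g += t * (Q[:, i] - Q[:, j])   # rank-one gradient update
    return x
```

Because each step updates only two coordinates and refreshes the gradient with a rank-one correction, the per-iteration cost is $$\mathcal{O}(n)$$ rather than the $$\mathcal{O}(n^{2})$$ of a full-gradient step on a dense quadratic, which is the "cheap coordinate derivatives" regime the summary refers to.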

##### MSC:
 90C25 Convex programming
##### Software:
LIBSVM; PDCO
##### References:
 [1] Beck, A., Tetruashvili, L.: On the convergence of block coordinate descent type methods. Technical report, Technion (2012) · Zbl 1297.90113
 [2] Berman, P.; Kovoor, N.; Pardalos, P.M.; Pardalos, P.M. (ed.), Algorithms for least distance problem, 33-56, (1993), Singapore · Zbl 0968.90501
 [3] Bertsekas, D.P.: Parallel and Distributed Computation: Numerical Methods. Athena Scientific, Nashua (2003)
 [4] Bertsekas, D.P.: Nonlinear Programming. Athena Scientific, Nashua (1999) · Zbl 1015.90077
 [5] Candes, E.; Romberg, J.; Tao, T., Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, IEEE Trans. Inf. Theory, 52, 489-509, (2006) · Zbl 1231.94017
 [6] Chen, S.; Donoho, D.; Saunders, M., Atomic decomposition by basis pursuit, SIAM Rev., 43, 129-159, (2001) · Zbl 0979.94010
 [7] Chang, C.C.; Lin, C.J., LIBSVM: a library for support vector machines, ACM Trans. Intell. Syst. Technol., 27, 1-27, (2011)
 [8] Dai, Y.H.; Fletcher, R., New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds, Math. Program., 106, 403-421, (2006) · Zbl 1134.90030
 [9] Ferris, M.C.; Munson, T.S., Interior-point methods for massive support vector machines, SIAM J. Optim., 13, 783-804, (2003) · Zbl 1039.90092
 [10] Hush, D.; Kelly, P.; Scovel, C.; Steinwart, I., QP algorithms with guaranteed accuracy and run time for support vector machines, J. Mach. Learn. Res., 7, 733-769, (2006) · Zbl 1222.68221
 [11] Judice, J.; Raydan, M.; Rosa, S.; Santos, S., On the solution of the symmetric eigenvalue complementarity problem by the spectral projected gradient algorithm, Numer. Algorithms, 47, 391-407, (2008) · Zbl 1144.65042
 [12] Kiwiel, K.C., On linear-time algorithms for the continuous quadratic knapsack problem, J. Optim. Theory Appl., 134, 549-554, (2007) · Zbl 1145.90077
 [13] Lin, C.J.; Lucidi, S.; Palagi, L.; Risi, A.; Sciandrone, M., A decomposition algorithm model for singly linearly constrained problems subject to lower and upper bounds, J. Optim. Theory Appl., 141, 107-126, (2009) · Zbl 1168.90556
 [14] List, N., Simon, H.U.: General Polynomial Time Decomposition Algorithms. Lecture Notes in Computer Science, vol. 3559, pp. 308-322. Springer, Berlin (2005) · Zbl 1127.68082
 [15] Necoara, I.; Nedelcu, V.; Dumitrache, I., Parallel and distributed optimization methods for estimation and control in networks, J. Process Control, 21, 756-766, (2011)
 [16] Necoara, I., Nesterov, Y., Glineur, F.: A random coordinate descent method on large optimization problems with linear constraints. Technical report, University Politehnica Bucharest (2011). http://acse.pub.ro/person/ion-necoara · Zbl 1370.90145
 [17] Necoara, I., Random coordinate descent algorithms for multi-agent convex optimization over networks, IEEE Trans. Autom. Control, 58, 1-12, (2013)
 [18] Necoara, I.; Clipici, D., Efficient parallel coordinate descent algorithm for convex optimization problems with separable constraints: application to distributed MPC, J. Process Control, 23, 243-253, (2013)
 [19] Nesterov, Y., Shpirko, S.: Primal-dual subgradient method for huge-scale linear conic problems (2012). http://www.optimization-online.org/DB_FILE/2012/08/3590.pdf · Zbl 1327.90217
 [20] Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic, Norwell (2004) · Zbl 1086.90045
 [21] Nesterov, Y.: Gradient methods for minimizing composite objective functions. CORE discussion paper 76/2007, Universite Catholique de Louvain (2007)
 [22] Nesterov, Y., Efficiency of coordinate descent methods on huge-scale optimization problems, SIAM J. Optim., 22, 341-362, (2012) · Zbl 1257.90073
 [23] Platt, J.C.: Fast Training of Support Vector Machines Using Sequential Minimal Optimization. Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge (1999)
 [24] Qin, Z., Scheinberg, K., Goldfarb, D.: Efficient block-coordinate descent algorithms for the group Lasso (2010), submitted · Zbl 1275.90059
 [25] Richtarik, P.; Takac, M., Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function, Math. Program., Ser. A, (2012) · Zbl 1237.90183
 [26] Richtarik, P., Takac, M.: Efficient serial and parallel coordinate descent methods for huge-scale truss topology design. Oper. Res. Proc., 27-32 (2012) · Zbl 1306.74043
 [27] Richtarik, P., Takac, M.: Parallel coordinate descent methods for big data optimization. Technical report (2012). arXiv:1212.0873 · Zbl 1342.90102
 [28] Rockafellar, R.T.; Bose, R.C. (ed.); Dowling, T.A. (ed.), The elementary vectors of a subspace in $$\mathbb{R}^{N}$$, 104-127, (1969), Chapel Hill
 [29] Rockafellar, R.T.: Network Flows and Monotropic Optimization. Wiley-Interscience, New York (1984) · Zbl 0596.90055
 [30] Saha, A.; Tewari, A., On the finite time convergence of cyclic coordinate descent methods, SIAM J. Optim., 23, 576-601, (2013) · Zbl 1270.90032
 [31] Tappenden, R., Richtarik, P., Gondzio, J.: Inexact coordinate descent: complexity and preconditioning (2013). arXiv:1304.5530 · Zbl 1350.65062
 [32] Tseng, P.; Yun, S., A coordinate gradient descent method for nonsmooth separable minimization, Math. Program., 117, 387-423, (2009) · Zbl 1166.90016
 [33] Tseng, P.; Yun, S., A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training, Comput. Optim. Appl., 47, 179-206, (2010) · Zbl 1226.90062
 [34] Tseng, P.; Yun, S., A block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization, J. Optim. Theory Appl., 140, 513-535, (2009) · Zbl 1190.90279
 [35] Yuan, M.; Lin, Y., Model selection and estimation in regression with grouped variables, J. R. Stat. Soc. B, 68, 49-67, (2006) · Zbl 1141.62030
 [36] Xiao, L.; Boyd, S., Optimal scaling of a gradient method for distributed resource allocation, J. Optim. Theory Appl., 129, 469-488, (2006) · Zbl 1330.90136
 [37] Xu, S.; Freund, M.; Sun, J., Solution methodologies for the smallest enclosing circle problem, Comput. Optim. Appl., 25, 283-292, (2003) · Zbl 1038.90080
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.