Nesterov, Yu. Efficiency of coordinate descent methods on huge-scale optimization problems. (English) Zbl 1257.90073
SIAM J. Optim. 22, No. 2, 341-362 (2012).

For the unconstrained minimization of a differentiable convex function \(f(x_{1},\dots,x_{n})\) (\(x_{i}\in \mathbb{R}^{n_{i}}\), \(i=1,\dots,n\)) with a globally Lipschitz gradient, the random coordinate descent method consists in randomly choosing, at iteration \(k\), some \(i_{k}\in \{1,\dots,n\}\) and then updating the current iterate by taking a step in the direction of the negative partial gradient with respect to \(x_{i_{k}}\), the step size being equal to the reciprocal of the Lipschitz constant of this partial gradient. The expected objective function value is shown to converge to the infimum of \(f\); for strongly convex functions, the rate of convergence is linear, and an accelerated version is presented. A modification of the method for constrained problems is also introduced. Implementation issues are discussed, and some preliminary numerical experiments are reported.

Reviewer: Juan-Enrique Martínez-Legaz (Barcelona)

MSC:
90C25 Convex programming
65K05 Numerical mathematical programming methods

Keywords: coordinate relaxation
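
To make the update rule concrete, the following is a minimal Python sketch of the method for the case where every block is one-dimensional; the quadratic test function, the interface grad_i, and the constants L[i] are illustrative assumptions, not taken from the paper. Uniform sampling of the coordinate is used for simplicity, whereas the paper also considers sampling probabilities weighted by powers of the coordinate Lipschitz constants.

import numpy as np

def random_coordinate_descent(grad_i, L, x0, n_iters=10_000, seed=0):
    # Sketch of random coordinate descent (assumed interface).
    # grad_i(i, x): i-th partial derivative of f at x.
    # L[i]: Lipschitz constant of the i-th partial gradient.
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    n = x.size
    for _ in range(n_iters):
        i = rng.integers(n)          # choose a coordinate uniformly at random
        x[i] -= grad_i(i, x) / L[i]  # step size 1/L_i along the negative partial gradient
    return x

# Toy convex quadratic f(x) = 0.5 x^T A x - b^T x (assumed example);
# its i-th partial gradient is (A x - b)_i, with coordinate constant L_i = A_ii.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = random_coordinate_descent(lambda i, x: A[i] @ x - b[i], np.diag(A), np.zeros(2))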