
Efficiency of coordinate descent methods on huge-scale optimization problems. (English) Zbl 1257.90073
For the unconstrained minimization of a differentiable convex function $$f(x_{1},\dots,x_{n})$$ ($$x_{i}\in \mathbb{R}^{n_{i}}$$, $$i=1,\dots,n$$) whose gradient is globally Lipschitz, the random coordinate descent method chooses, at iteration $$k$$, a random block index $$i_{k}\in \{1,\dots,n\}$$ and updates the current iterate by taking a step along the negative partial gradient with respect to $$x_{i_{k}}$$, with step size equal to the reciprocal of the Lipschitz constant of this partial gradient. The expected objective value is shown to converge to the infimum of $$f$$; for strongly convex functions the rate of convergence is linear, and an accelerated version is presented. A modification of the method for constrained problems is also introduced. Implementation issues are discussed, and preliminary numerical experiments are reported.
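As a concrete illustration (a minimal sketch, not the paper's implementation), the update can be instantiated for a convex quadratic $$f(x)=\tfrac12 x^{T}Ax-b^{T}x$$, for which the $$i$$-th partial gradient $$(Ax-b)_{i}$$ has Lipschitz constant $$L_{i}=A_{ii}$$. Uniform sampling of $$i_{k}$$ is assumed here for simplicity; the function name and test problem are illustrative.

```python
import numpy as np

def random_coordinate_descent(A, b, iters=10_000, seed=0):
    """Sketch of random coordinate descent for f(x) = 0.5*x'Ax - b'x,
    with A symmetric positive definite. The i-th partial gradient
    (Ax - b)_i has Lipschitz constant L_i = A[i, i]."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    L = np.diag(A).copy()        # coordinate-wise Lipschitz constants
    x = np.zeros(n)
    g = A @ x - b                # full gradient, maintained incrementally
    for _ in range(iters):
        i = rng.integers(n)      # uniform random coordinate choice
        step = g[i] / L[i]       # step size 1/L_i along coordinate i
        x[i] -= step
        g -= step * A[:, i]      # rank-one gradient update, O(n) per step
    return x

if __name__ == "__main__":
    # usage: a small strongly convex quadratic
    rng = np.random.default_rng(1)
    M = rng.standard_normal((50, 50))
    A = M.T @ M + np.eye(50)     # eigenvalues >= 1, hence strongly convex
    b = rng.standard_normal(50)
    x = random_coordinate_descent(A, b, iters=20_000)
    print(np.linalg.norm(A @ x - b))   # residual should be small
```

Each iteration touches only one column of $$A$$, which is what makes such methods attractive for the huge-scale problems the paper targets.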

##### MSC:
 90C25 Convex programming
 65K05 Numerical mathematical programming methods
##### Keywords:
coordinate relaxation