# zbMATH — the first resource for mathematics

Accelerated, parallel, and proximal coordinate descent. (English) Zbl 1327.65108
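The paper treats accelerated, parallel, proximal coordinate descent methods. As a point of orientation only, the following is a minimal sketch of plain (non-accelerated, serial) randomized proximal coordinate descent for the lasso problem $$\min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$$; it is not the method of the paper, and all function names (`soft_threshold`, `rpcd`) are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*|.|: shrink v toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def rpcd(A, b, lam, iters=2000, seed=0):
    """Randomized proximal coordinate descent for the lasso (illustrative only)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = (A ** 2).sum(axis=0)           # coordinate-wise Lipschitz constants of the smooth part
    x = np.zeros(n)
    r = A @ x - b                      # residual A x - b, maintained incrementally
    for _ in range(iters):
        i = rng.integers(n)            # sample one coordinate uniformly at random
        g = A[:, i] @ r                # partial gradient: A_i^T (A x - b)
        x_new = soft_threshold(x[i] - g / L[i], lam / L[i])
        r += A[:, i] * (x_new - x[i])  # cheap residual update
        x[i] = x_new
    return x
```

The accelerated variant studied in the paper additionally maintains extrapolation sequences and supports parallel (mini-batch) coordinate sampling, which this sketch omits.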

##### MSC:
 65K05 Numerical mathematical programming methods
 90C25 Convex programming
 68W20 Randomized algorithms
 65Y05 Parallel numerical computation
##### Keywords:
 KDD Cup
##### References:
 [1] A. Beck and M. Teboulle, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., 2 (2009), pp. 183–202. · Zbl 1175.94009
 [2] J. K. Bradley, A. Kyrola, D. Bickson, and C. Guestrin, Parallel coordinate descent for $$\ell_1$$-regularized loss minimization, in Proceedings of the 28th International Conference on Machine Learning, 2011, pp. 321–328.
 [3] L. El Ghaoui, V. Viallon, and T. Rabbani, Safe Feature Elimination for the Lasso, Technical report, University of California at Berkeley, Berkeley, CA, 2011. · Zbl 1259.65010
 [4] O. Fercoq, Z. Qu, P. Richtárik, and M. Takáč, Fast distributed coordinate descent for minimizing non-strongly convex losses, in IEEE International Workshop on Machine Learning for Signal Processing, 2014, pp. 1–6.
 [5] O. Fercoq and P. Richtárik, Smooth minimization of nonsmooth functions by parallel coordinate descent, arXiv:1309.5885, 2013.
 [6] Y. T. Lee and A. Sidford, Efficient accelerated coordinate descent methods and faster algorithms for solving linear systems, arXiv:1305.1922, 2013.
 [7] D. Leventhal and A. S. Lewis, Randomized methods for linear constraints: Convergence rates and conditioning, Math. Oper. Res., 35 (2010), pp. 641–654. · Zbl 1216.15006
 [8] Q. Lin, Z. Lu, and L. Xiao, An accelerated proximal coordinate gradient method and its application to regularized empirical risk minimization, arXiv:1407.1296, 2014. · Zbl 1329.65127
 [9] J. Liu, S. J. Wright, C. Ré, V. Bittorf, and S. Sridhar, An asynchronous parallel stochastic coordinate descent algorithm, arXiv:1311.1873, 2013. · Zbl 1337.68286
 [10] I. Necoara and D. Clipici, Distributed Coordinate Descent Methods for Composite Minimization, Tech. report, Politehnica University of Bucharest, Bucharest, Romania, 2013. · Zbl 1329.90108
 [11] I. Necoara and D. Clipici, Efficient parallel coordinate descent algorithm for convex optimization problems with separable constraints: Application to distributed MPC, J. Process Control, 23 (2013), pp. 243–253.
 [12] I. Necoara, Y. Nesterov, and F. Glineur, Efficiency of randomized coordinate descent methods on optimization problems with linearly coupled constraints, Tech. report, Politehnica University of Bucharest, Bucharest, Romania, 2012.
 [13] Y. Nesterov, A method of solving a convex programming problem with convergence rate $${O}(1/k^2)$$, Soviet Math. Dokl., 27 (1983), pp. 372–376. · Zbl 0535.90071
 [14] Y. Nesterov, Smooth minimization of nonsmooth functions, Math. Program., 103 (2005), pp. 127–152. · Zbl 1079.90102
 [15] Y. Nesterov, Efficiency of coordinate descent methods on huge-scale optimization problems, SIAM J. Optim., 22 (2012), pp. 341–362. · Zbl 1257.90073
 [16] J. C. Platt, Fast training of support vector machines using sequential minimal optimization, in Advances in Kernel Methods – Support Vector Learning, B. Schölkopf, C. Burges, and A. Smola, eds., MIT Press, Cambridge, MA, 1999, pp. 185–208.
 [17] Z. Qu, O. Fercoq, and P. Richtárik, Accelerated Coordinate Descent Method for Strongly Convex Functions, Tech. report 05/2014, University of Edinburgh, Edinburgh, UK, 2014.
 [18] P. Richtárik and M. Takáč, Distributed coordinate descent method for learning with big data, arXiv:1310.2059, 2013.
 [19] P. Richtárik and M. Takáč, On optimal probabilities in stochastic coordinate descent methods, Optim. Lett., 2015, DOI: 10.1007/s11590-015-0916-1.
 [20] P. Richtárik and M. Takáč, Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function, Math. Program., 144 (2014), pp. 1–38. · Zbl 1301.65051
 [21] P. Richtárik and M. Takáč, Parallel coordinate descent methods for big data optimization problems, Math. Program., Ser. A, (2015), DOI: 10.1007/s10107-015-0901-6.
 [22] S. Shalev-Shwartz and A. Tewari, Stochastic methods for $$\ell_1$$-regularized loss minimization, J. Mach. Learn. Res., 12 (2011), pp. 1865–1892. · Zbl 1280.62081
 [23] S. Shalev-Shwartz and T. Zhang, Proximal stochastic dual coordinate ascent, arXiv:1211.2717, 2012. · Zbl 1342.90103
 [24] S. Shalev-Shwartz and T. Zhang, Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization, arXiv:1309.2375, 2013. · Zbl 1307.68073
 [25] J. Stamper, A. Niculescu-Mizil, S. Ritter, G. J. Gordon, and K. R. Koedinger, Bridge to algebra 2008–2009. Challenge data set from KDD Cup 2010 educational data mining challenge, 2010; available online at http://pslcdatashop.web.cmu.edu/KDDCup/downloads.jsp.
 [26] M. Takáč, A. Bijral, P. Richtárik, and N. Srebro, Mini-batch primal and dual methods for SVMs, in Proceedings of the 30th International Conference on Machine Learning, Vol. 28, 2013, pp. 1022–1030.
 [27] R. Tappenden, P. Richtárik, and J. Gondzio, Inexact block coordinate descent method: Complexity and preconditioning, arXiv:1304.5530, 2013.
 [28] P. Tseng, On accelerated proximal gradient methods for convex-concave optimization, SIAM J. Optim., 2008, submitted.
 [29] T. T. Wu and K. Lange, Coordinate descent algorithms for lasso penalized regression, Ann. Appl. Stat., (2008), pp. 224–244. · Zbl 1137.62045
 [30] L. Xiao and Z. Lu, On the complexity analysis of randomized block-coordinate descent methods, arXiv:1305.4723, 2013. · Zbl 1321.65100
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.