
Operator splitting performance estimation: tight contraction factors and optimal parameter selection. (English) Zbl 1511.90322

Summary: We propose a methodology for studying the performance of common splitting methods through semidefinite programming. We prove tightness of the methodology and demonstrate its value through two applications. First, we use it as a tool for computer-assisted proofs, establishing tight analytical contraction factors for Douglas-Rachford splitting that are likely too complicated for a human to find bare-handed. Second, we use it as an algorithmic tool to select optimal splitting-method parameters computationally by solving a series of semidefinite programs.
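The parameter-selection idea above can be illustrated on a much simpler setting than the paper's SDP-based machinery. The sketch below (our own illustration, not the authors' method) uses gradient descent on mu-strongly convex, L-smooth functions, where the tight worst-case contraction factor of x+ = x - alpha*grad f(x) is known in closed form as max(|1 - alpha*mu|, |1 - alpha*L|); minimizing this factor over the step size mirrors, in miniature, how the paper minimizes SDP-certified contraction factors over the splitting method's parameters.

```python
# Illustrative sketch only: a closed-form stand-in for the paper's
# SDP-certified worst-case contraction factors. All names here are ours.

def worst_case_contraction(alpha, mu, L):
    """Tight contraction factor of x -> x - alpha * grad f(x) over the
    class of mu-strongly convex, L-smooth functions."""
    return max(abs(1 - alpha * mu), abs(1 - alpha * L))

def select_step_size(mu, L, n_grid=10_000):
    """Grid search for the step size minimizing the worst-case factor.
    (Analytically, alpha* = 2/(mu + L) with factor (L - mu)/(L + mu).)"""
    grid = (2 * k / (n_grid * (mu + L)) for k in range(1, 2 * n_grid))
    rho, alpha = min((worst_case_contraction(a, mu, L), a) for a in grid)
    return alpha, rho

alpha, rho = select_step_size(mu=1.0, L=10.0)
# alpha close to 2/11, rho close to 9/11 for (mu, L) = (1, 10)
```

In the paper the closed-form factor is replaced by the optimal value of a semidefinite program, so the same outer loop (evaluate the worst case, then minimize over parameters) solves a series of SDPs rather than a one-line formula.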

MSC:

90C22 Semidefinite programming
68Q25 Analysis of algorithms and problem complexity
90C25 Convex programming
90C60 Abstract computational complexity for mathematical programming problems
47H05 Monotone operators and generalizations
47H09 Contraction-type mappings, nonexpansive mappings, \(A\)-proper mappings, etc.

Software:

YALMIP; Mosek; PESTO

References:

[1] H. H. Bauschke, Fenchel duality, Fitzpatrick functions and the extension of firmly nonexpansive mappings, Proc. Amer. Math. Soc., 135 (2007), pp. 135-139. · Zbl 1181.49017
[2] H. H. Bauschke and P. L. Combettes, Convex Analysis and Monotone Operator Theory in Hilbert Spaces, 2nd ed., Springer, 2017. · Zbl 1359.26003
[3] H. H. Bauschke and X. Wang, Firmly Nonexpansive and Kirszbraun-Valentine extensions: A constructive approach via monotone operator theory, in Nonlinear Analysis and Optimization I: Nonlinear Analysis, American Mathematics Society, 2010, pp. 55-64. · Zbl 1247.47028
[4] H. H. Bauschke, X. Wang, and L. Yao, General resolvents for monotone operators: Characterization and extension, in Biomedical Mathematics: Promising Directions in Imaging, Therapy Planning, and Inverse Problems, Medical Physics Publishing, 2010, pp. 57-74.
[5] L. M. Briceño-Arias, Forward-Douglas-Rachford splitting and forward-partial inverse method for solving monotone inclusions, Optimization, 64 (2015), pp. 1239-1261. · Zbl 1310.47084
[6] L. M. Briceño-Arias and D. Davis, Forward-backward-half forward algorithm for solving monotone inclusions, SIAM J. Optim., 28 (2018), pp. 2839-2871, https://doi.org/10.1137/17M1120099. · Zbl 06951769
[7] R. E. Bruck, On the weak convergence of an ergodic iteration for the solution of variational inequalities for monotone operators in Hilbert space, J. Math. Anal. Appl., 61 (1977), pp. 159-164. · Zbl 0423.47023
[8] G. H-G. Chen and R. T. Rockafellar, Convergence rates in forward-backward splitting, SIAM J. Optim., 7 (1997), pp. 421-444, https://doi.org/10.1137/S1052623495290179. · Zbl 0876.49009
[9] L. Chen, X. Li, D. Sun, and K.-C. Toh, On the equivalence of inexact proximal ALM and ADMM for a class of convex composite programming, Math. Program., 2019; published online August 26, 2019.
[10] P. Chen, J. Huang, and X. Zhang, A primal-dual fixed point algorithm for minimization of the sum of three convex separable functions, Fixed Point Theory Appl., 2016 (2016), 54. · Zbl 1505.90094
[11] J.-P. Crouzeix and E. O. Anaya, Maximality is nothing but continuity, J. Convex Anal., 17 (2010), pp. 521-534. · Zbl 1191.49012
[12] J.-P. Crouzeix and E. O. Anaya, Monotone and maximal monotone affine subspaces, Oper. Res. Lett., 38 (2010), pp. 139-142. · Zbl 1185.90196
[13] J.-P. Crouzeix, E. O. Anaya, and W. Sosa, A construction of a maximal monotone extension of a monotone map, ESAIM Proc., 20 (2007), pp. 93-104. · Zbl 1359.47047
[14] E. R. Csetnek, Y. Malitsky, and M. K. Tam, Shadow Douglas-Rachford splitting for monotone inclusions, Appl. Math. Optim., 80 (2019), pp. 665-678. · Zbl 1447.47051
[15] D. Davis and W. Yin, Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions, Math. Oper. Res., 42 (2017), pp. 783-805. · Zbl 1379.65035
[16] D. Davis and W. Yin, A three-operator splitting scheme and its optimization applications, Set-Valued Var. Anal., 25 (2017), pp. 829-858. · Zbl 1464.47041
[17] W. Deng and W. Yin, On the global and linear convergence of the generalized alternating direction method of multipliers, J. Sci. Comput., 66 (2016), pp. 889-916. · Zbl 1379.65036
[18] J. Douglas and H. H. Rachford, On the numerical solution of heat conduction problems in two and three space variables, Trans. Amer. Math. Soc., 82 (1956), pp. 421-439. · Zbl 0070.35401
[19] Y. Drori, The exact information-based complexity of smooth convex minimization, J. Complexity, 39 (2017), pp. 1-16. · Zbl 1357.68072
[20] Y. Drori and M. Teboulle, Performance of first-order methods for smooth convex minimization: A novel approach, Math. Program., 145 (2014), pp. 451-482. · Zbl 1300.90068
[21] G. França and J. Bento, An explicit rate bound for over-relaxed ADMM, in Proceedings of the 2016 IEEE International Symposium on Information Theory (ISIT), IEEE, 2016, pp. 2104-2108.
[22] E. Ghadimi, A. Teixeira, I. Shames, and M. Johansson, On the optimal step-size selection for the alternating direction method of multipliers, IFAC Proc. Vol., 45 (2012), pp. 139-144.
[23] E. Ghadimi, A. Teixeira, I. Shames, and M. Johansson, Optimal parameter selection for the alternating direction method of multipliers (ADMM): Quadratic problems, IEEE Trans. Automat. Control, 60 (2015), pp. 644-658. · Zbl 1360.90182
[24] P. Giselsson, Tight linear convergence rate bounds for Douglas-Rachford splitting and ADMM, in Proceedings of the 54th Conference on Decision and Control, Osaka, Japan, 2015, pp. 3305-3310.
[25] P. Giselsson, Tight global linear convergence rate bounds for Douglas-Rachford splitting, J. Fixed Point Theory Appl., 19 (2017), pp. 2241-2270. · Zbl 1380.65104
[26] P. Giselsson and S. Boyd, Diagonal scaling in Douglas-Rachford splitting and ADMM, in Proceedings of the 53rd IEEE Conference on Decision and Control, Los Angeles, CA, 2014, pp. 5033-5039.
[27] P. Giselsson and S. Boyd, Linear convergence and metric selection for Douglas-Rachford splitting and ADMM, IEEE Trans. Automat. Control, 62 (2017), pp. 532-544. · Zbl 1364.90256
[28] A. A. Goldstein, Convex programming in Hilbert space, Bull. Amer. Math. Soc., 70 (1964), pp. 709-710. · Zbl 0142.17101
[29] G. Gu and J. Yang, On the Optimal Ergodic Sublinear Convergence Rate of the Relaxed Proximal Point Algorithm for Variational Inequalities, preprint, https://arxiv.org/abs/1905.06030, 2019.
[30] G. Gu and J. Yang, On the Optimal Linear Convergence Factor of the Relaxed Proximal Point Algorithm for Monotone Inclusion Problems, preprint, https://arxiv.org/abs/1905.04537, 2019.
[31] G. Gu and J. Yang, Optimal Nonergodic Sublinear Convergence Rate of Proximal Point Algorithm for Maximal Monotone Inclusion Problems, preprint, https://arxiv.org/abs/1904.05495, 2019.
[32] D. Han, D. Sun, and L. Zhang, Linear rate convergence of the alternating direction method of multipliers for convex composite programming, Math. Oper. Res., 43 (2018), pp. 622-637. · Zbl 1440.90047
[33] M. Hong and Z.-Q. Luo, On the linear convergence of the alternating direction method of multipliers, Math. Program., 162 (2017), pp. 165-199. · Zbl 1362.90313
[34] R. B. Kellogg, A nonlinear alternating direction method, Math. Comp., 23 (1969), pp. 23-27. · Zbl 0185.39902
[35] D. Kim and J. A. Fessler, Optimized first-order methods for smooth convex minimization, Math. Program., 159 (2016), pp. 81-107. · Zbl 1345.90113
[36] D. Kim and J. A. Fessler, On the convergence analysis of the optimized gradient method, J. Optim. Theory Appl., 172 (2017), pp. 187-205. · Zbl 1360.90200
[37] D. Kim and J. A. Fessler, Adaptive restart of the optimized gradient method for convex optimization, J. Optim. Theory Appl., 178 (2018), pp. 240-263. · Zbl 1406.90093
[38] D. Kim and J. A. Fessler, Another look at the fast iterative shrinkage/thresholding algorithm (FISTA), SIAM J. Optim., 28 (2018), pp. 223-250, https://doi.org/10.1137/16M108940X. · Zbl 1391.90476
[39] D. Kim and J. A. Fessler, Generalizing the optimized gradient method for smooth convex minimization, SIAM J. Optim., 28 (2018), pp. 1920-1950, https://doi.org/10.1137/17M112124X. · Zbl 1400.90245
[40] D. Kim and J. A. Fessler, Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions, preprint, https://arxiv.org/abs/1803.06600, 2018.
[41] M. Kirszbraun, Über die zusammenziehende und Lipschitzsche Transformationen, Fund. Math., 22 (1934), pp. 77-108. · JFM 60.0532.03
[42] G. M. Korpelevič, An extragradient method for finding saddle points and for other problems, Èkonom. i Mat. Metody, 12 (1976), pp. 747-756 (in Russian). · Zbl 0342.90044
[43] L. Lessard, B. Recht, and A. Packard, Analysis and design of optimization algorithms via integral quadratic constraints, SIAM J. Optim., 26 (2016), pp. 57-95, https://doi.org/10.1137/15M1009597. · Zbl 1329.90103
[44] E. S. Levitin and B. T. Polyak, Constrained minimization methods, Zh. Vychisl. Mat. Mat. Fiz., 6 (1966), pp. 787-823 (in Russian). · Zbl 0184.38902
[45] F. Lieder, On the Convergence Rate of the Halpern-Iteration, preprint 2017-11-6336, Optimization Online, 2017.
[46] T. Lin, S. Ma, and S. Zhang, An extragradient-based alternating direction method for convex minimization, Found. Comput. Math., 17 (2017), pp. 35-59. · Zbl 1364.90259
[47] P. L. Lions and B. Mercier, Splitting algorithms for the sum of two nonlinear operators, SIAM J. Numer. Anal., 16 (1979), pp. 964-979, https://doi.org/10.1137/0716071. · Zbl 0426.65050
[48] J. Löfberg, YALMIP: A toolbox for modeling and optimization in MATLAB, in Proceedings of the CACSD Conference, Taipei, Taiwan, 2004.
[49] Y. Malitsky, Golden ratio algorithms for variational inequalities, Math. Program., 2019; published online July 31, 2019.
[50] Y. Malitsky and M. K. Tam, A Forward-Backward Splitting Method for Monotone Inclusions without Cocoercivity, preprint, https://arxiv.org/abs/1808.04162, 2018. · Zbl 1445.47041
[51] B. Mercier, Inéquations variationnelles de la mécanique, Université de Paris-Sud, Département de mathématique, 1980. · Zbl 0444.73024
[52] MOSEK ApS, The MOSEK Optimization Toolbox for MATLAB Manual, Version 8.1, 2017, http://docs.mosek.com/8.1/toolbox/index.html.
[53] W. M. Moursi and L. Vandenberghe, Douglas-Rachford splitting for the sum of a Lipschitz continuous and a strongly monotone operator, J. Optim. Theory Appl., 183 (2019), pp. 179-198. · Zbl 07112082
[54] R. Nishihara, L. Lessard, B. Recht, A. Packard, and M. Jordan, A general analysis of the convergence of ADMM, in Proceedings of the 32nd International Conference on Machine Learning, Proc. Mach. Learn. Res. 37, PMLR, 2015, pp. 343-352.
[55] G. B. Passty, Ergodic convergence to a zero of the sum of monotone operators in Hilbert space, J. Math. Anal. Appl., 72 (1979), pp. 383-390. · Zbl 0428.47039
[56] D. W. Peaceman and H. H. Rachford, Jr., The numerical solution of parabolic and elliptic differential equations, J. Soc. Indust. Appl. Math., 3 (1955), pp. 28-41, https://doi.org/10.1137/0103003. · Zbl 0067.35801
[57] F. Pedregosa, On the Convergence Rate of the Three Operator Splitting Scheme, preprint, https://arxiv.org/abs/1610.07830, 2016.
[58] F. Pedregosa, K. Fatras, and M. Casotto, Proximal splitting meets variance reduction, in Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics, K. Chaudhuri and M. Sugiyama, eds., Proc. Mach. Learn. Res. 89, PMLR, 2019, pp. 1-10.
[59] F. Pedregosa and G. Gidel, Adaptive three operator splitting, in Proceedings of the 35th International Conference on Machine Learning, J. Dy and A. Krause, eds., Proc. Mach. Learn. Res. 80, PMLR, 2018, pp. 4085-4094.
[60] H. Raguet, A note on the forward-Douglas-Rachford splitting for monotone inclusion and convex optimization, Optim. Lett., 13 (2019), pp. 717-740. · Zbl 07078286
[61] H. Raguet, J. Fadili, and G. Peyré, A generalized forward-backward splitting, SIAM J. Imaging Sci., 6 (2013), pp. 1199-1226, https://doi.org/10.1137/120872802. · Zbl 1296.47109
[62] S. Reich, Extension problems for accretive sets in Banach spaces, J. Funct. Anal., 26 (1977), pp. 378-395. · Zbl 0378.47037
[63] S. Reich and S. Simons, Fenchel duality, Fitzpatrick functions and the Kirszbraun-Valentine extension theorem, Proc. Amer. Math. Soc., 133 (2005), pp. 2657-2660. · Zbl 1075.46020
[64] J. Rieger and M. K. Tam, Backward-Forward-Reflected-Backward Splitting for Three Operator Monotone Inclusions, preprint, https://arxiv.org/abs/2001.07327, 2020. · Zbl 07208681
[65] R. T. Rockafellar, Conjugate Duality and Optimization, CBMS-NSF Reg. Conf. Ser. Appl. Math. 16, SIAM, 1974, https://doi.org/10.1137/1.9781611970524. · Zbl 0296.90036
[66] E. K. Ryu and S. Boyd, Primer on monotone operator methods, Appl. Comput. Math., 15 (2016), pp. 3-43. · Zbl 1342.47066
[67] E. K. Ryu, R. Hannah, and W. Yin, Scaled Relative Graph: Nonexpansive Operators via 2D Euclidean Geometry, preprint, https://arxiv.org/abs/1902.09788, 2019.
[68] E. K. Ryu, A. B. Taylor, C. Bergeling, and P. Giselsson, Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection, preprint, https://arxiv.org/abs/1812.00146, 2020.
[69] E. K. Ryu and B. C. Vũ, Finding the forward-Douglas-Rachford-forward method, J. Optim. Theory Appl., 184 (2020), pp. 858-876. · Zbl 1513.47112
[70] J. H. Seidman, M. Fazlyab, V. M. Preciado, and G. J. Pappas, A control-theoretic approach to analysis and parameter selection of Douglas-Rachford splitting, IEEE Control Systems Lett., 4 (2020), pp. 199-204.
[71] A. Taylor, B. Van Scoy, and L. Lessard, Lyapunov functions for first-order methods: Tight automated convergence guarantees, in Proceedings of the 35th International Conference on Machine Learning, Proc. Mach. Learn. Res. 80, PMLR, 2018, pp. 4897-4906.
[72] A. B. Taylor, J. M. Hendrickx, and F. Glineur, Exact worst-case performance of first-order methods for composite convex optimization, SIAM J. Optim., 27 (2017), pp. 1283-1313, https://doi.org/10.1137/16M108104X. · Zbl 1371.90108
[73] A. B. Taylor, J. M. Hendrickx, and F. Glineur, Performance estimation toolbox (PESTO): Automated worst-case analysis of first-order optimization methods, in Proceedings of the 2017 IEEE 56th Annual Conference on Decision and Control (CDC), IEEE, 2017, pp. 1278-1283. · Zbl 1371.90108
[74] A. B. Taylor, J. M. Hendrickx, and F. Glineur, Smooth strongly convex interpolation and exact worst-case performance of first-order methods, Math. Program., 161 (2017), pp. 307-345. · Zbl 1359.90098
[75] A. B. Taylor, J. M. Hendrickx, and F. Glineur, Exact worst-case convergence rates of the proximal gradient method for composite convex minimization, J. Optim. Theory Appl., 178 (2018), pp. 455-476. · Zbl 1394.90464
[76] A. Teixeira, E. Ghadimi, I. Shames, H. Sandberg, and M. Johansson, Optimal scaling of the ADMM algorithm for distributed quadratic programming, in Proceedings of the 52nd IEEE Conference on Decision and Control, IEEE, 2013, pp. 6868-6873.
[77] A. Teixeira, E. Ghadimi, I. Shames, H. Sandberg, and M. Johansson, The ADMM algorithm for distributed quadratic problems: Parameter selection and constraint preconditioning, IEEE Trans. Signal Process., 64 (2016), pp. 290-305. · Zbl 1412.94093
[78] P. Tseng, Applications of a splitting algorithm to decomposition in convex programming and variational inequalities, SIAM J. Control Optim., 29 (1991), pp. 119-138, https://doi.org/10.1137/0329006. · Zbl 0737.90048
[79] P. Tseng, A modified forward-backward splitting method for maximal monotone mappings, SIAM J. Control Optim., 38 (2000), pp. 431-446, https://doi.org/10.1137/S0363012998338806. · Zbl 0997.90062
[80] F. A. Valentine, On the extension of a vector function so as to preserve a Lipschitz condition, Bull. Amer. Math. Soc., 49 (1943), pp. 100-108. · Zbl 0061.37505
[81] F. A. Valentine, A Lipschitz condition preserving extension for a vector function, Amer. J. Math., 67 (1945), pp. 83-93. · Zbl 0061.37507
[82] B. Van Scoy, R. A. Freeman, and K. M. Lynch, The fastest known globally convergent first-order method for minimizing strongly convex functions, IEEE Control Systems Lett., 2 (2018), pp. 49-54.
[83] H. Wang, M. Fazlyab, S. Chen, and V. M. Preciado, Robust Convergence Analysis of Three-Operator Splitting, preprint, https://arxiv.org/abs/1910.04229, 2019.
[84] X. Wang and L. Yao, Maximally monotone linear subspace extensions of monotone subspaces: Explicit constructions and characterizations, Math. Program., 139 (2013), pp. 327-352. · Zbl 1273.47082
[85] M. Yan, A new primal-dual algorithm for minimizing the sum of three functions with a linear operator, J. Sci. Comput., 76 (2018), pp. 1698-1717. · Zbl 1415.65142
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.