
New stepsizes for the gradient method. (English) Zbl 1459.90224

Summary: Gradient methods are known for their simplicity and low per-iteration cost, which makes them increasingly attractive for large-scale optimization problems. A good stepsize plays an important role in constructing an efficient gradient method. This paper proposes a new framework for generating stepsizes for gradient methods applied to convex quadratic minimization problems. By adopting different criteria, we propose four new gradient methods. For 2-dimensional unconstrained problems with convex quadratic objective functions, we prove that the new methods either terminate in finitely many iterations or converge \(R\)-superlinearly; for \(n\)-dimensional problems, we prove that all the new methods converge \(R\)-linearly. Numerical experiments show that the new methods require less computation and outperform existing gradient methods.
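The paper's four new stepsizes are not reproduced here, but a minimal sketch of a classical rule from the same family — the Barzilai-Borwein (BB) stepsize for minimizing \(f(x)=\tfrac12 x^TAx-b^Tx\) with \(A\) symmetric positive definite — illustrates the setting the summary describes (this is the standard BB1 rule, not one of the paper's methods; the function and parameter names are illustrative):

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A, using the Barzilai-Borwein (BB1) stepsize
    alpha_k = (s^T s) / (s^T y), with s = x_k - x_{k-1} and
    y = g_k - g_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # conservative first stepsize
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:   # stop once the gradient is small
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)      # BB1 stepsize for the next step
        x, g = x_new, g_new
    return x, k

# Small diagonal test problem: the minimizer is A^{-1} b.
A = np.diag([1.0, 5.0, 20.0])
b = np.array([1.0, 1.0, 1.0])
x_star, iters = bb_gradient(A, b, np.zeros(3))
```

For convex quadratics the BB method is known to converge \(R\)-linearly (Dai and Liao, reference-level result); the paper's contribution is a framework producing stepsizes with stronger (finite-termination or \(R\)-superlinear) behavior in the 2-dimensional case.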

MSC:

90C52 Methods of reduced gradient type
