Dai, Yu-Hong; Liao, Li-Zhi; Li, Duan
On restart procedures for the conjugate gradient method. (English) Zbl 1137.90669
Numer. Algorithms 35, No. 2-4, 249-260 (2004).

Summary: The conjugate gradient method is a powerful scheme for solving unconstrained optimization problems, especially large-scale ones. However, without restarts the method converges only linearly. In this paper, we consider an idea contained in [M. J. D. Powell, Restart procedures for the conjugate gradient method, Math. Program. 12, 241-254 (1977; Zbl 0396.90072)] and present a new restart technique for this method. Given an arbitrary descent direction \(d_t\) and the gradient \(g_t\), our key idea is to use the BFGS updating formula to provide a symmetric positive definite matrix \(P_t\) such that \(d_t=-P_tg_t\), and then to define the conjugate gradient iteration in the transformed space. Two conjugate gradient algorithms are designed based on the new restart technique. Their global convergence is proved under mild assumptions on the objective function. Numerical experiments are also reported, showing that the two algorithms are comparable to the Beale-Powell restart algorithm.

Cited in 22 Documents

MSC:
90C30 Nonlinear programming
49M37 Numerical methods based on nonlinear programming
65K10 Numerical optimization and variational techniques
90C52 Methods of reduced gradient type

Keywords: unconstrained optimization; conjugate gradient method; BFGS updating formula; restart; global convergence

Citations: Zbl 0396.90072

Software: minpack

Full Text: DOI
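As a rough illustration of the restart idea the paper builds on (not the authors' BFGS-based scheme, which transforms the iteration with the matrix \(P_t\)), the following is a minimal sketch of nonlinear conjugate gradients with Powell's 1977 restart test: the search is restarted with the steepest-descent direction whenever consecutive gradients are far from orthogonal. The function names and the Armijo backtracking line search are illustrative choices, not taken from the paper.

```python
import numpy as np

def cg_powell_restart(f, grad, x0, tol=1e-8, max_iter=500):
    """Polak-Ribiere nonlinear CG with Powell's restart criterion:
    restart (d = -g) whenever |g_new . g_old| >= 0.2 * ||g_new||^2,
    i.e. when successive gradients lose near-orthogonality."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        t, slope = 1.0, g @ d
        while f(x + t * d) > f(x) + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if abs(g_new @ g) >= 0.2 * (g_new @ g_new):
            d = -g_new              # restart: discard conjugacy information
        else:
            beta = g_new @ (g_new - g) / (g @ g)   # Polak-Ribiere beta
            d = -g_new + max(beta, 0.0) * d
        x, g = x_new, g_new
    return x

# Example on a simple convex quadratic (hypothetical test problem)
center = np.array([1.0, 2.0, 3.0])
f = lambda x: np.sum((x - center) ** 2)
grad = lambda x: 2.0 * (x - center)
x_star = cg_powell_restart(f, grad, np.zeros(3))
```

The 0.2 threshold is the constant proposed by Powell; without the restart test, accumulated loss of conjugacy on non-quadratic objectives degrades CG back toward linear convergence, which is the motivation for the restart procedures studied in the paper.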