A LINEARLY CONVERGENT CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION PROBLEMS
SUN MIN, LIU JING
In this paper, a new conjugate gradient method with a simple formula for $\beta_k$ is proposed for unconstrained optimization problems. The method possesses the sufficient descent property independently of the line search employed. Global convergence of the new method under the Goldstein line search and under the Armijo line search is established. For uniformly convex functions, the method attains a linear convergence rate. Preliminary computational experiments illustrate the efficiency of the proposed method in solving large-scale non-convex optimization problems.
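For context, the abstract refers to the standard nonlinear conjugate gradient framework sketched below; the paper's specific formula for $\beta_k$ is not reproduced in this excerpt, and the parameters $\rho$, $\sigma$, and $c$ are generic constants rather than the authors' choices. The iteration is
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\[2pt]
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]
where $g_k = \nabla f(x_k)$ and the step length $\alpha_k > 0$ may be chosen, for instance, by the Armijo rule: $\alpha_k$ is the largest member of $\{\rho^j : j = 0, 1, 2, \dots\}$, $\rho \in (0,1)$, satisfying
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \sigma \alpha_k g_k^{T} d_k, \qquad \sigma \in (0,1).
\]
The sufficient descent property mentioned in the abstract is understood in the usual sense that, for some constant $c > 0$,
\[
g_k^{T} d_k \le -c \|g_k\|^2 \quad \text{for all } k.
\]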