Title: A class of descent four-term extension of the Dai–Liao conjugate gradient method based on the scaled memoryless BFGS update
Authors: Saman Babaie-Kafaki, Reza Ghanbari
Abstract
Hybridizing the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Dai and Liao based on the scaled memoryless BFGS update, a one-parameter class of four-term conjugate gradient methods is proposed. It is shown that the suggested class of conjugate gradient methods possesses the sufficient descent property without any convexity assumption on the objective function. A brief global convergence analysis is carried out for uniformly convex objective functions. Results of numerical comparisons are reported; they demonstrate the efficiency of a method of the proposed class in the sense of the Dolan–Moré performance profile.
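As background to the abstract, the following is a minimal sketch of the standard two-term Dai–Liao iteration (not the authors' four-term class), using the well-known conjugate parameter β_k = g_{k+1}ᵀ(y_k − t·s_k) / (d_kᵀ y_k) with s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. The quadratic test problem, exact line search, and the value t = 0.1 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dai_liao_beta(g_new, g_old, d, s, t=0.1):
    """Dai-Liao conjugate parameter: beta = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k)."""
    y = g_new - g_old
    return g_new @ (y - t * s) / (d @ y)

def dl_cg_quadratic(A, b, x0, t=0.1, iters=None):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with the two-term Dai-Liao direction and exact line search."""
    n = len(x0)
    iters = n if iters is None else iters
    x = x0.astype(float)
    g = A @ x - b                         # gradient of the quadratic
    d = -g                                # initial direction: steepest descent
    for _ in range(iters):
        alpha = -(g @ d) / (d @ A @ d)    # exact line search step length
        s = alpha * d
        x_new = x + s
        g_new = A @ x_new - b
        beta = dai_liao_beta(g_new, g, d, s, t)
        d = -g_new + beta * d             # two-term Dai-Liao direction update
        x, g = x_new, g_new
    return x, g
```

With exact line search on a quadratic, g_{k+1}ᵀs_k = 0, so the t-dependent term vanishes and the iteration reduces to the Hestenes–Stiefel (linear CG) method, which terminates in at most n steps.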
Keywords
Unconstrained optimization, large-scale optimization, conjugate gradient method, sufficient descent property, BFGS update, global convergence.

@article{paperid:1066990,
author = {Babaie-Kafaki, Saman and Ghanbari, Reza},
title = {A class of descent four-term extension of the Dai--Liao conjugate gradient method based on the scaled memoryless BFGS update},
journal = {Journal of Industrial and Management Optimization},
year = {2017},
volume = {13},
number = {2},
month = {April},
issn = {1547-5816},
pages = {649--658},
numpages = {9},
keywords = {Unconstrained optimization; large-scale optimization; conjugate gradient method; sufficient descent property; BFGS update; global convergence},
}