Optimization (ISI), Volume 63, No. 7, July 2014, Pages 1027–1042

Title: Two hybrid nonlinear conjugate gradient methods based on a modified secant equation

Authors: Saman Babaie-Kafaki, Reza Ghanbari

Access to the full text has not been permitted by the authors.

Citation: BibTeX | EndNote

Abstract

In order to take advantage of the attractive features of the Hestenes–Stiefel and Dai–Yuan conjugate gradient (CG) methods, we suggest two hybridizations of these methods based on Andrei’s approach of hybridizing the CG parameters convexly and Powell’s approach of nonnegative restriction of the CG parameters. The hybridization parameter in our methods is computed from a modified secant equation obtained based on the search direction of the Hager–Zhang nonlinear CG method. We show that if the line search fulfils the Wolfe conditions, then one of our methods is globally convergent for uniformly convex functions and the other is globally convergent for general functions. We report some numerical results demonstrating the efficiency of our methods in the sense of the performance profile introduced by Dolan and Moré.
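For context, the convex hybridization mentioned in the abstract follows Andrei's scheme of blending the Hestenes–Stiefel and Dai–Yuan parameters. The sketch below is only a reminder of the standard textbook ingredients with a generic hybridization parameter \theta_k \in [0,1]; the paper's actual rule for computing \theta_k from the modified secant equation is not reproduced in the abstract, so it is not shown here.

\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \qquad
\beta_k^{hyb} = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY},

where y_k = g_{k+1} - g_k and d_k is the current search direction. Powell's nonnegative restriction then uses \max\{\beta_k, 0\} in place of \beta_k before forming the next direction d_{k+1} = -g_{k+1} + \beta_k d_k.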

Keywords

unconstrained optimization; large-scale optimization; secant equation; conjugate gradient algorithm; global convergence

@article{paperid:1029843,
author = {Babaie-Kafaki, Saman and Ghanbari, Reza},
title = {Two hybrid nonlinear conjugate gradient methods based on a modified secant equation},
journal = {Optimization},
year = {2014},
volume = {63},
number = {7},
month = {July},
issn = {0233-1934},
pages = {1027--1042},
numpages = {15},
keywords = {unconstrained optimization; large-scale optimization; secant equation; conjugate gradient algorithm; global convergence},
}


%0 Journal Article
%T Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
%A Babaie-Kafaki, Saman
%A Ghanbari, Reza
%J Optimization
%@ 0233-1934
%D 2014
