An optimal parameter for the Generalized Descent Symmetrical
Hestenes-Stiefel algorithm
Abstract
In earlier work [12] we proposed the Generalized Descent Symmetrical Hestenes-Stiefel
algorithm, GDSHS for short, which generates sufficient descent
directions for the objective function. Under the Wolfe line
search conditions, the global convergence of the method was
established via a spectral analysis of the conjugate gradient
iteration matrix together with the Zoutendijk condition for steepest
descent methods. In this paper we propose a theoretical choice of an
optimal parameter to improve the performance of the GDSHS algorithm,
and develop several descent algorithms based on it. 86 numerical
experiments are presented to verify their performance, and the numerical
results show that the new conjugate gradient method GDSHS with the
parameter c = 1, denoted GDSHS1, is competitive with GDSHS
algorithms whose parameter c is chosen in the interval ]0, +∞[.
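To make the setting concrete, the following is a minimal sketch of a Hestenes-Stiefel-type nonlinear conjugate gradient iteration with a Wolfe line search, a descent safeguard, and a scalar parameter c scaling the conjugacy coefficient. The function name `hs_cg`, the placement of c, and the safeguard are illustrative assumptions for this sketch; they are not the GDSHS formula of [12].

```python
import numpy as np
from scipy.optimize import line_search

def hs_cg(f, grad, x0, c=1.0, tol=1e-6, max_iter=200):
    """Generic HS-type nonlinear CG with a Wolfe line search.

    The scaling parameter `c` on the HS coefficient and the restart
    safeguard are illustrative assumptions, not the GDSHS scheme.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:               # search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                   # gradient difference y_k
        denom = d @ y
        # HS coefficient beta = (g_{k+1}^T y_k)/(d_k^T y_k), scaled by c
        beta = c * (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d_new = -g_new + beta * d
        # descent safeguard: fall back to -g if d_new is not descent
        if g_new @ d_new > -1e-10 * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x
```

On a strictly convex quadratic the iteration reduces to linear CG and converges in a few steps; varying c in ]0, +∞[ mimics the parameter choice studied for GDSHS.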