Modification Of The Fletcher-Reeves Conjugate Gradient Method In Unconstrained Optimization
Keywords: Optimization, Unconstrained Optimization, Conjugate Gradient, Modified FR
Abstract
Conjugate gradient (CG) methods are a class of unconstrained optimization algorithms with strong local and global convergence properties and low memory requirements. The Hestenes-Stiefel and Polak-Ribière methods are sometimes more efficient than the Fletcher-Reeves (FR) method, which can be slower in practice. The numerical performance of the FR method is sometimes inconsistent: when the step s_k is small, g_{k+1} ≈ g_k, so the search directions d_{k+1} and d_k can become nearly dependent. This paper introduces a modification of the FR method to overcome this disadvantage. The algorithm uses the strong Wolfe line search conditions. The descent property and global convergence of the method are established, and numerical results are reported.
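To make the setting concrete, the following is a minimal sketch of the standard (unmodified) FR conjugate gradient iteration, using SciPy's `line_search`, which enforces the strong Wolfe conditions. The function names `fr_cg`, `f`, and `grad` are illustrative assumptions; the paper's proposed modification of the FR update is not shown here.

```python
import numpy as np
from scipy.optimize import line_search

def fr_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Standard Fletcher-Reeves CG with a strong Wolfe line search (sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:       # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # FR parameter: beta_k = ||g_{k+1}||^2 / ||g_k||^2
        beta_fr = (g_new @ g_new) / (g @ g)
        d = -g_new + beta_fr * d
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, for example f(x) = 0.5 x^T A x - b^T x with a symmetric positive definite A, this iteration converges to the minimizer A^{-1} b. The small-step degeneracy mentioned above arises because when alpha * d is tiny, g_{k+1} ≈ g_k and beta_fr ≈ 1, so d_{k+1} ≈ d_k up to the small term -g_{k+1}.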