Modification Of The Fletcher-Reeves Conjugate Gradient Method In Unconstrained Optimization

Authors

Keywords:

Optimization, Unconstrained Optimization, Conjugate Gradient, Modified FR

Abstract

Conjugate gradient (CG) methods are a class of unconstrained optimization algorithms with strong local and global convergence properties and low memory requirements. The Hestenes-Stiefel and Polak-Ribière methods are often more efficient than the Fletcher-Reeves (FR) method, which tends to be slower in practice. The numerical performance of the FR method can also be inconsistent: when the step s_k is small, g_{k+1} ≈ g_k, and hence the directions d_{k+1} and d_k can become nearly dependent. This paper introduces a modification of the FR method to overcome this drawback. The algorithm uses the strong Wolfe line search conditions. The descent property and global convergence of the method are established, and numerical results are reported.
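For context, the sketch below illustrates the classical Fletcher-Reeves iteration that the abstract refers to, with the coefficient beta_k = ||g_{k+1}||^2 / ||g_k||^2 and a strong Wolfe line search. It is not the authors' modified method; the function name fr_cg, the tolerances, and the restart rule are illustrative assumptions.

```python
# Minimal sketch of the classical Fletcher-Reeves CG method (illustrative,
# not the modified FR method proposed in the paper).
import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def fr_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical Fletcher-Reeves conjugate gradient (sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                     # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient: beta = ||g_{k+1}||^2 / ||g_k||^2
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d                 # new conjugate direction
        x, g = x_new, g_new
    return x

# Usage example: minimize the Rosenbrock function
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(fr_cg(f, grad, np.array([-1.2, 1.0])))
```

When the step is small and g_{k+1} ≈ g_k, the beta computed above keeps d_{k+1} close to a multiple of d_k, which is exactly the weakness the paper's modification is designed to address.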


Published

2022-11-11

How to Cite

Khalil K. Abbo, & Naba M. Hasan. (2022). Modification Of The Fletcher-Reeves Conjugate Gradient Method In Unconstrained Optimization. African Journal of Advanced Pure and Applied Sciences (AJAPAS), 1(4), 192–202. Retrieved from https://aaasjournals.com/index.php/ajapas/article/view/169

Issue

Section

Articles