DESCENT MODIFIED CONJUGATE GRADIENT METHODS FOR VECTOR OPTIMIZATION PROBLEMS

Authors

  • Jamilu Yahaya, Ahmadu Bello University Zaria, Nigeria
  • Ibrahim Arzuka, Department of Mathematics, Bauchi State University Gadau, Nigeria
  • Mustapha Isyaku, Department of Mathematics, Federal University Dutsin-Ma, Nigeria

DOI:

https://doi.org/10.58715/bangmodjmcs.2023.9.6

Keywords:

Conjugate gradient method, sufficient descent condition, vector optimization

Abstract

Scalarization approaches transform vector optimization problems (VOPs) into single-objective optimization problems. These approaches are elegant; however, they suffer from the drawback of requiring weights to be assigned in order to prioritize specific objective functions. In contrast, the conjugate gradient (CG) algorithm provides an attractive alternative that requires neither the conversion of the objective functions nor the assignment of weights, yet a set of Pareto-optimal solutions is still obtainable. We introduce three CG techniques for solving VOPs by modifying the search directions of the Fletcher-Reeves (FR), conjugate descent (CD), and Dai-Yuan (DY) methods so that the sufficient descent property holds without the use of any line search, while also achieving good convergence properties. Numerical experiments are conducted to demonstrate the implementation and efficiency of the proposed techniques.
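
For context, the sketch below shows a standard single-objective nonlinear CG iteration with the classical FR, CD, and DY beta formulas that the paper modifies. It is purely illustrative and rests on assumed choices (an Armijo backtracking line search and a steepest-descent restart); it does not reproduce the authors' vector-valued modified search directions, which are descent directions without any line search.

```python
import numpy as np


def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=500):
    """Single-objective nonlinear CG with the classical FR/CD/DY beta rules.

    Illustrative only: uses a simple Armijo backtracking line search and a
    steepest-descent restart; the paper's modified, vector-valued search
    directions are not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                       # safeguard: restart if d is not a descent direction
            d = -g
        t = 1.0                              # Armijo backtracking (illustrative choice)
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if beta_rule == "FR":                # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        elif beta_rule == "CD":              # conjugate descent
            beta = -(g_new @ g_new) / (d @ g)
        else:                                # Dai-Yuan
            beta = (g_new @ g_new) / (d @ (g_new - g))
        d = -g_new + beta * d                # next search direction
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Minimize a strictly convex quadratic as a quick sanity check.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    for rule in ("FR", "CD", "DY"):
        print(rule, nonlinear_cg(f, grad, np.zeros(2), beta_rule=rule))
```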

Published

2023-12-31

How to Cite

Yahaya, J., Arzuka, I., & Isyaku, M. (2023). DESCENT MODIFIED CONJUGATE GRADIENT METHODS FOR VECTOR OPTIMIZATION PROBLEMS. Bangmod International Journal of Mathematical and Computational Science, 9, 72–91. https://doi.org/10.58715/bangmodjmcs.2023.9.6

Issue

Vol. 9 (2023)

Section

Research Article