DESCENT MODIFIED CONJUGATE GRADIENT METHODS FOR VECTOR OPTIMIZATION PROBLEMS
DOI: https://doi.org/10.58715/bangmodjmcs.2023.9.6

Keywords: Conjugate gradient method, sufficient descent condition, vector optimization

Abstract
Scalarization approaches transform vector optimization problems (VOPs) into single-objective problems. Although elegant, these approaches have the drawback of requiring the assignment of weights to prioritize specific objective functions. In contrast, the conjugate gradient (CG) algorithm offers an attractive alternative that requires neither the conversion of the objective functions nor the assignment of weights, while the set of Pareto-optimal solutions remains obtainable. We introduce three CG techniques for solving VOPs by modifying their search directions. Specifically, we modify the search directions of the Fletcher-Reeves (FR), conjugate descent (CD), and Dai-Yuan (DY) CG methods so that they satisfy the descent property independently of any line search, while also attaining good convergence properties. Numerical experiments demonstrate the implementation and efficiency of the proposed techniques.
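To make the idea of a descent-modified search direction concrete, the sketch below shows one well-known way to modify the FR direction so that the sufficient descent condition g&#8342;&#7511;d&#8342; = -&#8214;g&#8342;&#8214;&#178; holds at every iteration regardless of the step-size rule. This is an illustration on a single scalar-valued objective with a fixed step; the paper's vector-valued formulation and its exact modified directions for FR, CD, and DY may differ, and the function and parameter names here are ours.

```python
import numpy as np

def modified_fr_cg(grad, x0, tol=1e-6, max_iter=500, step=1e-2):
    """Illustrative descent-modified Fletcher-Reeves CG on a scalar
    objective (not the paper's exact vector-valued scheme)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + step * d                    # fixed step: no line search needed
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves parameter
        # Scale the gradient term so that g_new^T d = -||g_new||^2
        # holds identically, independent of beta and the step rule.
        theta = 1.0 + beta * (g_new @ d) / (g_new @ g_new)
        d = -theta * g_new + beta * d
        g = g_new
    return x
```

A quick check of the algebra: g&#8342;&#8330;&#8321;&#7511;d&#8342;&#8330;&#8321; = -&#952;&#8214;g&#8342;&#8330;&#8321;&#8214;&#178; + &#946;g&#8342;&#8330;&#8321;&#7511;d&#8342; = -&#8214;g&#8342;&#8330;&#8321;&#8214;&#178;, so the direction is a descent direction for any positive step, which is the kind of line-search-free descent property the abstract refers to.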
License
Copyright (c) 2023 Bangmod International Journal of Mathematical and Computational Science
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.