Proposal: Adding Differentiable Multi-Objective Optimization (MOO) Test Functions #367

@Souza-DR

Description
Hello,

I'm exploring the possibility of contributing a collection of differentiable multi-objective optimization (MOO) test functions to the OptimizationProblems.jl repository. I have personally implemented these functions and their gradients in Julia.
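To give a concrete picture of what such a contribution would contain, here is a minimal sketch of one classical problem from this suite, JOS1, with its analytic Jacobian. The function names and layout are illustrative only and do not follow any existing OptimizationProblems.jl API; how to expose multi-objective problems through the repository's model structure is exactly what the open questions below ask about.

```julia
# JOS1 test problem: two convex quadratic objectives on R^n.
#   f1(x) = (1/n) * sum(x_i^2)
#   f2(x) = (1/n) * sum((x_i - 2)^2)
# Names and signatures here are hypothetical, for illustration only.

# Objective vector: returns [f1(x), f2(x)].
function jos1(x::AbstractVector)
    n = length(x)
    f1 = sum(abs2, x) / n
    f2 = sum(abs2, x .- 2) / n
    return [f1, f2]
end

# Jacobian of the objective vector: row i holds the gradient of f_i.
function jos1_jacobian(x::AbstractVector)
    n = length(x)
    g1 = 2 .* x ./ n          # gradient of f1
    g2 = 2 .* (x .- 2) ./ n   # gradient of f2
    return [g1'; g2']         # 2 × n matrix
end

x0 = [1.0, 1.0]
fx = jos1(x0)           # [1.0, 1.0]
J  = jos1_jacobian(x0)  # [1.0 1.0; -1.0 -1.0]
```

Each problem in the suite would follow this pattern: closed-form objectives plus hand-coded first derivatives, so solvers can be benchmarked without relying on automatic differentiation.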

Motivation

My main motivation is to use these problems as a standardized testbed for experiments with multi-objective algorithms I'm currently developing. This suite of functions has already been used in several published works, such as:

  1. Lucambio Pérez, L. R., & Prudente, L. F. (2018). Nonlinear conjugate gradient methods for vector optimization. SIAM Journal on Optimization, 28(3), 2690-2720.

  2. Gonçalves, M. L. N., & Prudente, L. F. (2020). On the extension of the Hager–Zhang conjugate gradient method for vector optimization. Computational Optimization and Applications, 76(3), 889-916.

  3. Assunção, P. B., Ferreira, O. P., & Prudente, L. F. (2021). Conditional gradient method for multiobjective optimization. Computational Optimization and Applications, 78(3), 741-768.

  4. Gonçalves, M. L. N., Lima, F. S., & Prudente, L. F. (2022). A study of Liu-Storey conjugate gradient methods for vector optimization. Applied Mathematics and Computation, 425, 127099.

  5. Gonçalves, M. L. N., Lima, F. S., & Prudente, L. F. (2022). Globally convergent Newton-type methods for multiobjective optimization. Computational Optimization and Applications, 83(2), 403-434.

  6. Bello-Cruz, Y., Melo, J. G., Prudente, L. F., & Serra, R. V. G. (2024). A Proximal Gradient Method with an Explicit Line Search for Multiobjective Optimization. arXiv preprint arXiv:2404.10993.

  7. Lapucci, M., & Mansueto, P. (2023). A limited memory Quasi-Newton approach for multi-objective optimization. Computational Optimization and Applications, 85(1), 33-73.

  8. Yang, Y. X., Deng, X., & Tang, L. P. (2025). Global Convergence of a Modified BFGS-Type Method Based on Function Information for Nonconvex Multiobjective Optimization Problems. Journal of the Operations Research Society of China.

  9. Chen, W., Tang, L., & Yang, X. (2025). Improvements to steepest descent method for multi-objective optimization. Numerical Algorithms.

  10. He, Q. R., Li, S. J., Zhang, B. Y., et al. (2024). A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization. Computational Optimization and Applications, 89(3), 805-842.

  11. Prudente, L. F., & Souza, D. R. (2022). A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization. Journal of Optimization Theory and Applications, 194(3), 1107-1140.

  12. Prudente, L. F., & Souza, D. R. (2024). Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems. Computational Optimization and Applications, 88(3), 719-757.

I believe these functions could be useful for researchers working on multi-objective algorithms in Julia and would complement the existing scalar test problems available in the repository.

Open Questions

Before proceeding with a more detailed proposal or implementation, I'd like to ask:

  1. Would a contribution like this align with the goals of the repository?
  2. Would it be acceptable to introduce a new category or model structure for multi-objective problems?
  3. Is there any existing effort or recommendation I should be aware of before structuring the code?

Additional Information

I'd be happy to follow the design patterns of OptimizationProblems.jl, and I'm open to feedback on how best to integrate this kind of functionality.

Looking forward to your thoughts!

Best regards,
Danilo R. Souza
