Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/113678
Title: | Autonomous multi-objective optimization using large language model | Authors: | Huang, Y; Wu, S; Zhang, W; Wu, J; Feng, L; Tan, KC |
Issue Date: | 2025 | Source: | IEEE transactions on evolutionary computation, Date of Publication: 15 April 2025, Early Access, https://doi.org/10.1109/TEVC.2025.3561001 | Abstract: | Multi-objective optimization problems (MOPs) are ubiquitous in real-world applications, presenting the complex challenge of balancing multiple conflicting objectives. Traditional multi-objective evolutionary algorithms (MOEAs), though effective, often rely on domain-specific expertise for improved optimization performance, hindering adaptability to unseen MOPs. In recent years, Large Language Models (LLMs) have revolutionized software engineering by enabling the autonomous generation and refinement of programs. Leveraging this breakthrough, we propose a new LLM-based framework that autonomously designs MOEAs for solving MOPs. The proposed framework includes a robust testing module that refines the generated MOEA through error-driven dialogue with LLMs, together with a dynamic selection strategy and informative prompting-based crossover and mutation operators that fit the textual optimization pipeline. Our approach facilitates the design of MOEAs without extensive expert intervention, thereby speeding up MOEA innovation. Empirical studies across various MOP categories validate the robustness and superior performance of our proposed framework. | Keywords: | Automatic Algorithm Design; Large Language Model; Multi-objective Optimization |
Publisher: | Institute of Electrical and Electronics Engineers | Journal: | IEEE transactions on evolutionary computation | ISSN: | 1089-778X | EISSN: | 1941-0026 | DOI: | 10.1109/TEVC.2025.3561001 |
Appears in Collections: | Journal/Magazine Article |
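The abstract above describes a testing module that refines LLM-generated MOEAs through error-driven dialogue. The following is a minimal illustrative sketch of that general idea, not the authors' implementation: the helper names query_llm and run_moea_on_benchmark are hypothetical placeholders, and the loop simply feeds test failures back to the model as a repair prompt.

# Illustrative sketch (not the paper's code) of an error-driven refinement loop:
# an LLM drafts an MOEA, a testing module runs it, and any failure is returned
# to the LLM as a new prompt. query_llm and run_moea_on_benchmark are
# hypothetical placeholders to be replaced with a real LLM client and benchmark.
import traceback


def query_llm(prompt: str) -> str:
    """Placeholder for an LLM API call that returns MOEA source code."""
    raise NotImplementedError("plug in your preferred LLM client here")


def run_moea_on_benchmark(moea_source: str) -> float:
    """Placeholder testing module: execute the generated MOEA on a benchmark
    MOP and return a performance indicator value (e.g., hypervolume)."""
    namespace: dict = {}
    exec(moea_source, namespace)   # raises if the generated code is broken
    return namespace["solve"]()    # assumes the LLM defines a solve() function


def refine_moea(task_description: str, max_rounds: int = 5) -> str:
    """Error-driven dialogue: ask the LLM to repair its own program until the
    testing module accepts it or the round budget is exhausted."""
    prompt = f"Write a Python multi-objective evolutionary algorithm for: {task_description}"
    for _ in range(max_rounds):
        code = query_llm(prompt)
        try:
            score = run_moea_on_benchmark(code)
            print(f"accepted candidate with indicator value {score:.4f}")
            return code
        except Exception:
            # Feed the traceback back so the next draft can correct the error.
            error_report = traceback.format_exc()
            prompt = (
                "Your previous MOEA failed during testing.\n"
                f"Error:\n{error_report}\n"
                "Please return a corrected, complete program."
            )
    raise RuntimeError("no runnable MOEA produced within the round budget")

The dynamic selection strategy and prompting-based crossover and mutation mentioned in the abstract would operate on a population of such generated programs; they are omitted here because the record does not specify their details.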