Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/116122
Title: Pretrain-finetune neural operator for multi-fidelity surrogate modeling of structural dynamic systems
Authors: Li, PL; Liu, SF; Ni, YQ; Ling, JM; Wang, YW
Issue Date: 15-Nov-2025
Source: Engineering Structures, 15 Nov. 2025, v. 343, pt. D, 121218
Abstract: The integration of machine learning algorithms into structural dynamics modeling has attracted wide attention because it enables the real-time computation required by structural digital twins to help ensure structural safety. This paper proposes a Pretrain-Finetune Neural Operator (PF-NO) that uses multi-fidelity data as a surrogate model for structural systems with variable physical parameters and excitations. In this framework, pre-training serves as the knowledge-learning stage: the structural dynamics equations are embedded as a physics loss, and the Newmark-β numerical integration algorithm is applied to generate low-fidelity training data. The pretrained model is then fine-tuned on high-fidelity data, such as measurements, to further improve prediction accuracy. In addition, for multi-degree-of-freedom structural systems, a modal factor is introduced to simplify the vibration equation, so that PF-NO training is not limited by the modal order and gains greater flexibility. Two linear system examples show that the PF-NO model achieves higher precision than a classical numerical method and a state-of-the-art deep learning model, with superior extrapolation capability, indicating strong potential for more complex engineering applications.
Keywords: Neural operator; Structural dynamics; Surrogate model
Publisher: Elsevier Ltd
Journal: Engineering Structures
ISSN: 0141-0296
EISSN: 1873-7323
DOI: 10.1016/j.engstruct.2025.121218
Appears in Collections: Journal/Magazine Article
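The abstract states that low-fidelity training data are generated with the Newmark-β numerical integration algorithm. As background, the following is a minimal, generic sketch of the standard Newmark-β scheme for a linear single-degree-of-freedom system (m·ẍ + c·ẋ + k·x = f(t)), using the common average-acceleration parameters β = 1/4, γ = 1/2; it is not the authors' implementation, and the function name and interface are illustrative assumptions.

```python
import numpy as np

def newmark_beta(m, c, k, f, x0, v0, dt, beta=0.25, gamma=0.5):
    """Integrate m*x'' + c*x' + k*x = f(t) for a linear SDOF system.

    f is the load sampled at uniform steps dt; returns displacement,
    velocity, and acceleration histories. Default parameters give the
    unconditionally stable average-acceleration method.
    """
    n = len(f)
    x, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    x[0], v[0] = x0, v0
    a[0] = (f[0] - c * v0 - k * x0) / m  # initial acceleration from equilibrium
    # Effective stiffness is constant for a linear system
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    for i in range(n - 1):
        # Effective load built from the current state (Chopra-style form)
        p_eff = (f[i + 1]
                 + m * (x[i] / (beta * dt ** 2) + v[i] / (beta * dt)
                        + (1.0 / (2 * beta) - 1.0) * a[i])
                 + c * (gamma / (beta * dt) * x[i]
                        + (gamma / beta - 1.0) * v[i]
                        + dt * (gamma / (2 * beta) - 1.0) * a[i]))
        x[i + 1] = p_eff / k_eff
        a[i + 1] = ((x[i + 1] - x[i]) / (beta * dt ** 2)
                    - v[i] / (beta * dt)
                    - (1.0 / (2 * beta) - 1.0) * a[i])
        v[i + 1] = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a[i + 1])
    return x, v, a
```

For an undamped free vibration (c = 0, f = 0) with m = 1 and k = 4, the computed displacement closely tracks the analytical solution x(t) = x0·cos(2t), which is a common sanity check for such an integrator.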
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.