Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/112028
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | en_US
dc.contributor | Research Institute for Smart Energy | en_US
dc.creator | Cai, Y | en_US
dc.creator | Chen, G | en_US
dc.creator | Qiao, Z | en_US
dc.date.accessioned | 2025-03-27T03:12:26Z | -
dc.date.available | 2025-03-27T03:12:26Z | -
dc.identifier.issn | 0893-6080 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/112028 | -
dc.language.iso | en | en_US
dc.publisher | Pergamon Press | en_US
dc.rights | © 2025 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Cai, Y., Chen, G., & Qiao, Z. (2025). Neural networks trained by weight permutation are universal approximators. Neural Networks, 187, 107277 is available at https://dx.doi.org/10.1016/j.neunet.2025.107277. | en_US
dc.subject | Learning behavior | en_US
dc.subject | Neural networks | en_US
dc.subject | Training algorithm | en_US
dc.subject | Universal approximation property | en_US
dc.title | Neural networks trained by weight permutation are universal approximators | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 187 | en_US
dc.identifier.doi | 10.1016/j.neunet.2025.107277 | en_US
dcterms.abstract | The universal approximation property is fundamental to the success of neural networks, and has traditionally been achieved by training networks without any constraints on their parameters. However, recent experimental research proposed a novel permutation-based training method, which achieved the desired classification performance without modifying the exact weight values. In this paper, we provide a theoretical guarantee for this permutation training method by proving its ability to guide a ReLU network to approximate one-dimensional continuous functions. Our numerical results further validate the method's efficiency in regression tasks with various initializations. Notable observations made during weight permutation suggest that permutation training can provide an innovative tool for describing network learning behavior. (A toy illustration of permutation-only training appears after this record.) | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Neural networks, July 2025, v. 187, 107277 | en_US
dcterms.isPartOf | Neural networks | en_US
dcterms.issued | 2025-07 | -
dc.identifier.scopus | 2-s2.0-86000725968 | -
dc.identifier.artn | 107277 | en_US
dc.description.validate | 202503 bchy | en_US
dc.description.oaVersion | Version of Record | en_US
dc.identifier.FolderNumber | OA_TA, a3885b | -
dc.identifier.SubFormID | 51539 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | CAS AMSS-PolyU Joint Laboratory of Applied Mathematics, Hong Kong; National Natural Science Foundation of China; Hong Kong Polytechnic University | en_US
dc.description.pubStatus | Published | en_US
dc.description.TA | Elsevier (2025) | en_US
dc.description.oaCategory | TA | en_US
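As a companion to the abstract above, here is a minimal Python sketch of the permutation-training idea: the multiset of weight values in a small one-hidden-layer ReLU network is frozen at initialization, and training proceeds only by reordering those values, here via a greedy search over random pairwise swaps against a one-dimensional regression target. Everything in it (the network shape, the greedy swap rule, and the names predict, loss, and theta) is an illustrative assumption, not the algorithm or construction analyzed in the paper.

import numpy as np

rng = np.random.default_rng(0)

# Target: a continuous function on [0, 1], sampled on a grid.
x = np.linspace(0.0, 1.0, 128)
y = np.sin(2.0 * np.pi * x)

H = 32                                 # hidden width of the ReLU network
theta = rng.normal(size=3 * H)         # flat parameter vector: [w1 | b1 | w2]

def predict(theta, x):
    # One-hidden-layer ReLU network, evaluated on all grid points at once.
    w1, b1, w2 = theta[:H], theta[H:2 * H], theta[2 * H:]
    hidden = np.maximum(0.0, np.outer(x, w1) + b1)
    return hidden @ w2

def loss(theta):
    return np.mean((predict(theta, x) - y) ** 2)

# Permutation-only training: the values in theta never change, only their
# positions. Propose a random transposition; keep it if the loss does not rise.
best = loss(theta)
for _ in range(20000):
    i, j = rng.integers(0, theta.size, size=2)
    theta[i], theta[j] = theta[j], theta[i]
    trial = loss(theta)
    if trial <= best:
        best = trial
    else:
        theta[i], theta[j] = theta[j], theta[i]   # revert the swap

print(f"MSE after permutation-only training: {best:.4f}")

Greedy transposition search is only one plausible way to explore the permutation group; the paper's theoretical guarantee need not correspond to this heuristic.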
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
1-s2.0-S089360802500156X-main.pdf | - | 2.26 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.