Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/112028
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Applied Mathematics | en_US |
| dc.contributor | Research Institute for Smart Energy | en_US |
| dc.creator | Cai, Y | en_US |
| dc.creator | Chen, G | en_US |
| dc.creator | Qiao, Z | en_US |
| dc.date.accessioned | 2025-03-27T03:12:26Z | - |
| dc.date.available | 2025-03-27T03:12:26Z | - |
| dc.identifier.issn | 0893-6080 | en_US |
| dc.identifier.uri | http://hdl.handle.net/10397/112028 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Pergamon Press | en_US |
| dc.rights | © 2025 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). | en_US |
| dc.rights | The following publication Cai, Y., Chen, G., & Qiao, Z. (2025). Neural networks trained by weight permutation are universal approximators. Neural Networks, 187, 107277 is available at https://dx.doi.org/10.1016/j.neunet.2025.107277. | en_US |
| dc.subject | Learning behavior | en_US |
| dc.subject | Neural networks | en_US |
| dc.subject | Training algorithm | en_US |
| dc.subject | Universal approximation property | en_US |
| dc.title | Neural networks trained by weight permutation are universal approximators | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.volume | 187 | en_US |
| dc.identifier.doi | 10.1016/j.neunet.2025.107277 | en_US |
| dcterms.abstract | The universal approximation property is fundamental to the success of neural networks, and has traditionally been achieved by training networks without any constraints on their parameters. However, recent experimental research proposed a novel permutation-based training method, which exhibited a desired classification performance without modifying the exact weight values. In this paper, we provide a theoretical guarantee of this permutation training method by proving its ability to guide a ReLU network to approximate one-dimensional continuous functions. Our numerical results further validate this method's efficiency in regression tasks with various initializations. The notable observations during weight permutation suggest that permutation training can provide an innovative tool for describing network learning behavior. | en_US |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | Neural networks, July 2025, v. 187, 107277 | en_US |
| dcterms.isPartOf | Neural networks | en_US |
| dcterms.issued | 2025-07 | - |
| dc.identifier.scopus | 2-s2.0-86000725968 | - |
| dc.identifier.artn | 107277 | en_US |
| dc.description.validate | 202503 bchy | en_US |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | OA_TA, a3885b | - |
| dc.identifier.SubFormID | 51539 | - |
| dc.description.fundingSource | RGC | en_US |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | CAS AMSS-PolyU Joint Laboratory of Applied Mathematics, Hong Kong; National Natural Science Foundation of China; Hong Kong Polytechnic University | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.description.TA | Elsevier (2025) | en_US |
| dc.description.oaCategory | TA | en_US |
| Appears in Collections: | Journal/Magazine Article | |
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| 1-s2.0-S089360802500156X-main.pdf |  | 2.26 MB | Adobe PDF | View/Open |
Page views: 7 (as of Apr 14, 2025)
Downloads: 3 (as of Apr 14, 2025)
Scopus™ citations: 1 (as of Dec 19, 2025)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.