Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114870
Title: MHNet: Multi-view high-order network for diagnosing neurodevelopmental disorders using resting-state fMRI
Authors: Li, Y
Zeng, W
Dong, W
Cai, L
Wang, L
Chen, H
Yan, H
Bian, L
Wang, N 
Issue Date: Oct-2025
Source: Journal of imaging informatics in medicine, Oct. 2025, v. 38, no. 5, p. 2994–3014
Abstract: Deep learning models have shown promise in diagnosing neurodevelopmental disorders (NDD) such as autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD). However, many models either use graph neural networks (GNN) to construct single-level brain functional networks (BFNs) or employ spatial convolution filtering for local information extraction from rs-fMRI data, often neglecting high-order features crucial for NDD classification. We introduce a Multi-view High-order Network (MHNet) to capture hierarchical and high-order features from multi-view BFNs derived from rs-fMRI data for NDD prediction. MHNet has two branches: the Euclidean Space Features Extraction (ESFE) module and the Non-Euclidean Space Features Extraction (Non-ESFE) module, followed by a Feature Fusion-based Classification (FFC) module for NDD identification. ESFE includes a Functional Connectivity Generation (FCG) module and a High-order Convolutional Neural Network (HCNN) module to extract local and high-order features from BFNs in Euclidean space. Non-ESFE comprises a Generic Internet-like Brain Hierarchical Network Generation (G-IBHN-G) module and a High-order Graph Neural Network (HGNN) module to capture topological and high-order features in non-Euclidean space. Experiments on three public datasets show that MHNet outperforms state-of-the-art methods using both AAL1 and Brainnetome Atlas templates. Extensive ablation studies confirm the superiority of MHNet and the effectiveness of using multi-view fMRI information and high-order features. Our study also offers atlas options for constructing more sophisticated hierarchical networks and explains the association between key brain regions and NDD. MHNet leverages multi-view feature learning from both Euclidean and non-Euclidean spaces, incorporating high-order information from BFNs to enhance NDD classification performance.
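
The abstract describes a two-branch, multi-view design: a convolutional branch over functional-connectivity (FC) matrices in Euclidean space, a graph branch over brain networks in non-Euclidean space, and a fusion classifier. The following is a minimal, self-contained PyTorch sketch of that general pattern only; the class names, layer sizes, and the plain GCN layer are illustrative assumptions and do not reproduce the paper's HCNN, G-IBHN-G, or HGNN modules.

```python
# Illustrative sketch of a two-branch FC-matrix + graph model with late fusion.
# All names and dimensions are assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    """One plain graph-convolution step: relu(A_hat @ X @ W) (assumed stand-in for HGNN)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (batch, nodes, in_dim); adj: (batch, nodes, nodes), row-normalized
        return torch.relu(self.linear(torch.bmm(adj, x)))


class TwoBranchSketch(nn.Module):
    def __init__(self, n_rois=116, n_classes=2):  # 116 ROIs as in the AAL1 atlas
        super().__init__()
        # Euclidean branch: 2-D convolutions over the ROI x ROI FC matrix
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),            # -> (batch, 32*4*4)
        )
        # Non-Euclidean branch: graph convolutions; node features = FC row profiles
        self.gcn1 = SimpleGCNLayer(n_rois, 64)
        self.gcn2 = SimpleGCNLayer(64, 32)
        # Fusion of both views, then classification
        self.classifier = nn.Sequential(
            nn.Linear(32 * 4 * 4 + 32, 64), nn.ReLU(), nn.Linear(64, n_classes)
        )

    def forward(self, fc, adj):
        # fc: (batch, n_rois, n_rois) functional-connectivity matrices
        euclid = self.cnn(fc.unsqueeze(1))                       # grid-like local features
        nodes = self.gcn2(self.gcn1(fc, adj), adj).mean(dim=1)   # graph readout (mean pool)
        return self.classifier(torch.cat([euclid, nodes], dim=1))


if __name__ == "__main__":
    fc = torch.randn(4, 116, 116)                              # toy FC matrices
    adj = torch.softmax(torch.randn(4, 116, 116), dim=-1)      # toy row-normalized adjacency
    print(TwoBranchSketch()(fc, adj).shape)                    # torch.Size([4, 2])
```

In this toy setup, the same FC matrix feeds both views (as an image for the CNN and as node features plus adjacency for the GCN); the paper additionally builds hierarchical multi-view brain networks and extracts high-order features, which are omitted here for brevity.
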
Keywords: Convolution neural network
Euclidean space
Graph neural network
High-order
Multi-view
Neurodevelopmental disorder
Non-Euclidean space
rs-fMRI
Publisher: Springer New York LLC
Journal: Journal of imaging informatics in medicine 
ISSN: 2948-2925
EISSN: 2948-2933
DOI: 10.1007/s10278-025-01399-5
Rights: © The Author(s) 2025
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
The following publication Li, Y., Zeng, W., Dong, W. et al. MHNet: Multi-view High-Order Network for Diagnosing Neurodevelopmental Disorders Using Resting-State fMRI. Journal of Imaging Informatics in Medicine 38, 2994–3014 (2025) is available at https://doi.org/10.1007/s10278-025-01399-5.
Appears in Collections:Journal/Magazine Article

Files in This Item:
File: s10278-025-01399-5.pdf (3.33 MB, Adobe PDF)
Open Access Information
Status: Open access
File Version: Version of Record

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.