Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/109227
Title: Adaptive spatiotemporal neural networks through complementary hybridization
Authors: Wu, Y 
Shi, B
Zheng, Z
Zheng, H
Yu, F
Liu, X
Luo, G
Deng, L
Issue Date: 2024
Source: Nature communications, 2024, v. 15, 7355
Abstract: Processing spatiotemporal data sources with both high spatial dimension and rich temporal information is a ubiquitous need in machine intelligence. Recurrent neural networks in the machine learning domain and bio-inspired spiking neural networks in the neuromorphic computing domain are two promising candidate models for dealing with spatiotemporal data via extrinsic dynamics and intrinsic dynamics, respectively. Nevertheless, these networks follow disparate modeling paradigms, which leads to different performance characteristics and makes it hard for either one to cover the diverse data sources and performance requirements found in practice. Constructing a unified modeling framework that can effectively and adaptively process variable spatiotemporal data in different situations remains challenging. In this work, we propose hybrid spatiotemporal neural networks created by combining recurrent neural networks and spiking neural networks under a unified surrogate gradient learning framework and a Hessian-aware neuron selection method. By flexibly tuning the ratio between the two types of neurons, the hybrid model balances different performance metrics, including accuracy, robustness, and efficiency, more adaptively on several typical benchmarks, and generally outperforms conventional single-paradigm recurrent neural networks and spiking neural networks. Furthermore, we demonstrate the great potential of the proposed network with a robotic task in varying environments. With this proof of concept, the proposed hybrid model provides a generic modeling route for processing spatiotemporal data sources in the open world.
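The core hybridization idea in the abstract — one layer mixing spiking neurons (intrinsic membrane dynamics) with recurrent units (extrinsic dynamics), in a tunable ratio — can be sketched as a toy forward pass. This is an illustrative sketch only, not the authors' implementation: the `hybrid_step` function, its parameters (`ratio`, `v_th`, `decay`), and the hard-reset leaky integrate-and-fire update are all assumptions, and the paper's surrogate gradient training and Hessian-aware neuron selection are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_step(x, h, v, Wx, Wh, ratio=0.5, v_th=1.0, decay=0.9):
    """One time step of a hypothetical hybrid layer.

    A `ratio` fraction of the units are leaky integrate-and-fire
    (spiking) neurons; the remainder are tanh recurrent units.
    Both halves share the same synaptic input.
    """
    n = h.shape[0]
    k = int(ratio * n)                  # number of spiking units
    i_t = Wx @ x + Wh @ h               # shared synaptic input current

    # Spiking half: leaky integration, threshold firing, hard reset.
    v[:k] = decay * v[:k] + i_t[:k]
    spikes = (v[:k] >= v_th).astype(float)
    v[:k] = v[:k] * (1.0 - spikes)      # reset membranes that fired

    # Recurrent half: standard tanh update on the same input.
    h_new = np.empty(n)
    h_new[:k] = spikes
    h_new[k:] = np.tanh(i_t[k:])
    return h_new, v

# Tiny usage example: run a short random input sequence.
n_in, n_hid = 4, 8
Wx = rng.standard_normal((n_hid, n_in)) * 0.5
Wh = rng.standard_normal((n_hid, n_hid)) * 0.5
h = np.zeros(n_hid)   # hidden state (spikes + rates)
v = np.zeros(n_hid)   # membrane potentials (spiking units only)
for t in range(5):
    x = rng.standard_normal(n_in)
    h, v = hybrid_step(x, h, v, Wx, Wh, ratio=0.5)
print(h.shape)  # prints (8,)
```

Setting `ratio=0.0` or `ratio=1.0` recovers a pure recurrent or pure spiking layer, which is what makes the ratio a natural knob for trading off accuracy, robustness, and efficiency as the abstract describes.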
Publisher: Nature Publishing Group
Journal: Nature communications 
EISSN: 2041-1723
DOI: 10.1038/s41467-024-51641-x
Rights: © The Author(s) 2024
This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
The following publication Wu, Y., Shi, B., Zheng, Z. et al. Adaptive spatiotemporal neural networks through complementary hybridization. Nat Commun 15, 7355 (2024) is available at https://doi.org/10.1038/s41467-024-51641-x.
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: s41467-024-51641-x.pdf (2.54 MB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record

Page views: 19 (as of Nov 24, 2024)
Downloads: 12 (as of Nov 24, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.