Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/109227
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Wu, Y | en_US
dc.creator | Shi, B | en_US
dc.creator | Zheng, Z | en_US
dc.creator | Zheng, H | en_US
dc.creator | Yu, F | en_US
dc.creator | Liu, X | en_US
dc.creator | Luo, G | en_US
dc.creator | Deng, L | en_US
dc.date.accessioned | 2024-10-03T08:15:05Z | -
dc.date.available | 2024-10-03T08:15:05Z | -
dc.identifier.uri | http://hdl.handle.net/10397/109227 | -
dc.language.iso | en | en_US
dc.publisher | Nature Publishing Group | en_US
dc.rights | © The Author(s) 2024 | en_US
dc.rights | This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/. | en_US
dc.rights | The following publication Wu, Y., Shi, B., Zheng, Z. et al. Adaptive spatiotemporal neural networks through complementary hybridization. Nat Commun 15, 7355 (2024) is available at https://doi.org/10.1038/s41467-024-51641-x. | en_US
dc.title | Adaptive spatiotemporal neural networks through complementary hybridization | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 15 | en_US
dc.identifier.doi | 10.1038/s41467-024-51641-x | en_US
dcterms.abstract | Processing spatiotemporal data sources with both high spatial dimension and rich temporal information is a ubiquitous need in machine intelligence. Recurrent neural networks in the machine learning domain and bio-inspired spiking neural networks in the neuromorphic computing domain are two promising candidate models for dealing with spatiotemporal data via extrinsic dynamics and intrinsic dynamics, respectively. Nevertheless, these networks have disparate modeling paradigms, which leads to different performance results, making it hard for them to cover diverse data sources and performance requirements in practice. Constructing a unified modeling framework that can effectively and adaptively process variable spatiotemporal data in different situations remains quite challenging. In this work, we propose hybrid spatiotemporal neural networks created by combining the recurrent neural networks and spiking neural networks under a unified surrogate gradient learning framework and a Hessian-aware neuron selection method. By flexibly tuning the ratio between two types of neurons, the hybrid model demonstrates better adaptive ability in balancing different performance metrics, including accuracy, robustness, and efficiency on several typical benchmarks, and generally outperforms conventional single-paradigm recurrent neural networks and spiking neural networks. Furthermore, we evidence the great potential of the proposed network with a robotic task in varying environments. With our proof of concept, the proposed hybrid model provides a generic modeling route to process spatiotemporal data sources in the open world. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Nature communications, 2024, v. 15, 7355 | en_US
dcterms.isPartOf | Nature communications | en_US
dcterms.issued | 2024 | -
dc.identifier.scopus | 2-s2.0-85202159550 | -
dc.identifier.pmid | 39191782 | -
dc.identifier.eissn | 2041-1723 | en_US
dc.identifier.artn | 7355 | en_US
dc.description.validate | 202410 bcc | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Others | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China; STI 2030-Major Projects; CETC Haikang Group-Brain Inspired Computing Joint Research Center; Hong Kong Polytechnic University; Chinese Institute for Brain Research, Beijing | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
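
The dcterms.abstract field above describes a hybrid layer whose hidden units are split between bio-inspired spiking neurons, trained with a surrogate gradient, and conventional recurrent neurons, with the ratio between the two types tuned to balance accuracy, robustness, and efficiency. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea only; the names HybridLayer, spike_ratio, and SurrogateSpike are assumptions made here, not taken from the paper or its code, and the sketch does not reproduce the authors' Hessian-aware neuron selection method.

```python
# Illustrative sketch only (not the authors' implementation): a hybrid recurrent layer
# in which a tunable fraction of hidden units behaves as leaky integrate-and-fire (LIF)
# spiking neurons trained with a surrogate gradient, and the rest as conventional
# non-spiking recurrent units. All class and parameter names are hypothetical.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate gradient backward."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Rectangular surrogate: pass gradients only near the firing threshold.
        return grad_output * (v.abs() < 0.5).float()


class HybridLayer(nn.Module):
    def __init__(self, input_size, hidden_size, spike_ratio=0.5, decay=0.9):
        super().__init__()
        self.n_spiking = int(hidden_size * spike_ratio)   # units with intrinsic (spiking) dynamics
        self.n_analog = hidden_size - self.n_spiking      # units with extrinsic (RNN-style) dynamics
        self.decay = decay
        self.in_proj = nn.Linear(input_size, hidden_size)
        self.rec_proj = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x):                                  # x: (time, batch, input_size)
        T, B, _ = x.shape
        h = x.new_zeros(B, self.n_spiking + self.n_analog) # layer output (spikes + analog states)
        v = x.new_zeros(B, self.n_spiking)                 # membrane potentials of spiking units
        outputs = []
        for t in range(T):
            pre = self.in_proj(x[t]) + self.rec_proj(h)
            pre_spk, pre_ana = pre[:, :self.n_spiking], pre[:, self.n_spiking:]
            # Spiking units: leaky membrane integration, fire at threshold 1, soft reset.
            v = self.decay * v + pre_spk
            s = SurrogateSpike.apply(v - 1.0)
            v = v - s
            # Analog units: standard tanh recurrent update.
            a = torch.tanh(pre_ana)
            h = torch.cat([s, a], dim=1)
            outputs.append(h)
        return torch.stack(outputs)                        # (time, batch, hidden_size)
```

In this sketch, setting spike_ratio to 0.0 or 1.0 recovers a purely recurrent or purely spiking layer, which mirrors the abstract's point that the hybrid model spans both single-paradigm extremes through one tunable ratio.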
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
s41467-024-51641-x.pdf | - | 2.54 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Page views: 19 (as of Nov 24, 2024)
Downloads: 12 (as of Nov 24, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.