Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/109062
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Data Science and Artificial Intelligence | en_US |
dc.creator | Zhang, S | en_US |
dc.creator | Yang, Q | en_US |
dc.creator | Ma, C | en_US |
dc.creator | Wu, J | en_US |
dc.creator | Li, H | en_US |
dc.creator | Tan, KC | en_US |
dc.date.accessioned | 2024-09-17T03:06:44Z | - |
dc.date.available | 2024-09-17T03:06:44Z | - |
dc.identifier.isbn | 1-57735-887-2 | en_US |
dc.identifier.isbn | 978-1-57735-887-9 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/109062 | - |
dc.description | Thirty-Eighth AAAI Conference on Artificial Intelligence, February 20–27, 2024, Vancouver, Canada | en_US |
dc.language.iso | en | en_US |
dc.publisher | Association for the Advancement of Artificial Intelligence | en_US |
dc.rights | Copyright © 2024, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved. | en_US |
dc.rights | The following publication Zhang, S., Yang, Q., Ma, C., Wu, J., Li, H., & Tan, K. C. (2024, March). Tc-lif: A two-compartment spiking neuron model for long-term sequential modelling. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 38, No. 15, pp. 16838-16847) is available at https://ojs.aaai.org/index.php/AAAI/article/view/29625. | en_US |
dc.title | TC-LIF: a two-compartment spiking neuron model for long-term sequential modelling | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 16838 | en_US |
dc.identifier.epage | 16847 | en_US |
dc.identifier.doi | 10.1609/aaai.v38i15.29625 | en_US |
dcterms.abstract | The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays. As a result, it remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues. To address this challenge, we propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF. The proposed model incorporates carefully designed somatic and dendritic compartments that are tailored to facilitate learning long-term temporal dependencies. Furthermore, a theoretical analysis is provided to validate the effectiveness of TC-LIF in propagating error gradients over an extended temporal duration. Our experimental results, on a diverse range of temporal classification tasks, demonstrate superior temporal classification capability, rapid training convergence, and high energy efficiency of the proposed TC-LIF model. Therefore, this work opens up a myriad of opportunities for solving challenging temporal processing tasks on emerging neuromorphic computing systems. Our code is publicly available at https://github.com/ZhangShimin1/TC-LIF. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | In M Wooldridge, J Dy, & S Natarajan (Eds.), Proceedings of the 38th AAAI Conference on Artificial Intelligence, p. 16838-16847. Washington, DC: Association for the Advancement of Artificial Intelligence, 2024 | en_US |
dcterms.issued | 2024 | - |
dc.relation.conference | Conference on Artificial Intelligence [AAAI] | en_US |
dc.description.validate | 202409 bcch | en_US |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | a2887b | - |
dc.identifier.SubFormID | 48653 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | National Natural Science Foundation of China | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | VoR allowed | en_US |
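The abstract above describes a neuron model with coupled somatic and dendritic compartments. As a rough illustration only, the sketch below simulates a generic two-compartment leaky integrate-and-fire neuron; all parameter names, values, and the soft-reset rule are assumptions for demonstration, not the actual TC-LIF formulation from the paper (see the linked code repository for that).

```python
# Illustrative two-compartment leaky integrate-and-fire dynamics.
# NOTE: parameters and update rules here are generic assumptions,
# NOT the TC-LIF model as published.

def simulate_two_compartment_lif(inputs, beta_d=0.9, beta_s=0.9,
                                 w_ds=0.5, threshold=1.0):
    """Simulate a toy two-compartment LIF neuron.

    inputs    : sequence of input currents to the dendritic compartment
    beta_d    : dendritic membrane decay factor
    beta_s    : somatic membrane decay factor
    w_ds      : coupling weight from dendrite to soma
    threshold : somatic firing threshold
    Returns a list of output spikes (0/1 per time step).
    """
    v_d, v_s = 0.0, 0.0
    spikes = []
    for x in inputs:
        v_d = beta_d * v_d + x            # dendrite integrates the input
        v_s = beta_s * v_s + w_ds * v_d   # soma integrates dendritic drive
        s = 1 if v_s >= threshold else 0  # fire when threshold is crossed
        v_s -= s * threshold              # soft reset on spike
        spikes.append(s)
    return spikes

print(simulate_two_compartment_lif([1.0, 0.0, 0.0, 1.0, 0.0]))
```

Because the somatic potential is driven by the slowly decaying dendritic state rather than the raw input, a single input pulse can influence spiking several steps later, which is the intuition behind using multiple compartments for long-term temporal dependencies.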
Appears in Collections: | Conference Paper |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
29625-Article Text-33679-1-2-20240324.pdf | | 482.06 kB | Adobe PDF | View/Open |
Page views: 23 | Downloads: 8 (as of Sep 22, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.