Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107481
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Cai, S | -
dc.creator | Zhang, R | -
dc.creator | Zhang, M | -
dc.creator | Wu, J | -
dc.creator | Li, H | -
dc.date.accessioned | 2024-06-27T01:33:42Z | -
dc.date.available | 2024-06-27T01:33:42Z | -
dc.identifier.issn | 2379-8920 | -
dc.identifier.uri | http://hdl.handle.net/10397/107481 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.subject | Auditory attention | en_US
dc.subject | Auditory system | en_US
dc.subject | Brain modeling | en_US
dc.subject | Convolution | en_US
dc.subject | Convolutional neural networks | en_US
dc.subject | EEG | en_US
dc.subject | Electroencephalography | en_US
dc.subject | Feature extraction | en_US
dc.subject | Graph convolutional network | en_US
dc.subject | Neurons | en_US
dc.subject | Spiking neural network | en_US
dc.title | EEG-based auditory attention detection with spiking graph convolutional network | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.doi | 10.1109/TCDS.2024.3376433 | -
dcterms.abstract | Decoding auditory attention from brain activity recordings such as electroencephalography (EEG) sheds light on solving the machine cocktail party problem. However, effective representation of EEG signals remains a challenge. One reason is that current feature extraction techniques have not fully exploited the spatial information across EEG channels. EEG signals reflect the collective dynamics of brain activity across different regions, and the intricate interactions among channels, rather than any individual channel alone, carry the distinctive features of that activity. In this study, we propose a spiking graph convolutional network, called SGCN, which captures the spatial features of multi-channel EEG in a biologically plausible manner. Comprehensive experiments were conducted on two publicly available datasets. Results demonstrate that the proposed SGCN achieves competitive auditory attention detection (AAD) performance in low-latency and low-density EEG settings. Because of its low power consumption, the SGCN has the potential for practical deployment in intelligent hearing aids and other brain-computer interfaces (BCIs). | -
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | IEEE transactions on cognitive and developmental systems, Date of Publication: 12 March 2024, Early Access, https://doi.org/10.1109/TCDS.2024.3376433 | -
dcterms.isPartOf | IEEE transactions on cognitive and developmental systems | -
dcterms.issued | 2024 | -
dc.identifier.scopus | 2-s2.0-85187988182 | -
dc.identifier.eissn | 2379-8939 | -
dc.description.validate | 202406 bcch | -
dc.identifier.FolderNumber | a2887 | en_US
dc.identifier.SubFormID | 48652 | en_US
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Early release | en_US
dc.date.embargo | 0000-00-00 (to be updated) | en_US
dc.description.oaCategory | Green (AAM) | en_US
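The dcterms.abstract above describes the paper's core idea: a graph convolution over EEG channels combined with spiking neurons. The record itself gives no implementation details, so the following is only a minimal illustrative sketch of a spiking graph convolution layer, not the authors' SGCN: the channel adjacency, layer dimensions, and the leaky integrate-and-fire (LIF) parameters (tau, v_th) are assumptions made for the example.

```python
# Minimal, hypothetical sketch of a spiking graph convolution over EEG channels.
# NOT the authors' SGCN implementation: adjacency, sizes, and LIF parameters
# below are illustrative assumptions.
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalize a channel adjacency with self-loops:
    A_hat = D^{-1/2} (A + I) D^{-1/2} (standard GCN propagation rule)."""
    A = A + np.eye(A.shape[0])
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt

class SpikingGraphConvLayer:
    """One graph convolution over EEG channels followed by
    leaky integrate-and-fire (LIF) spiking neurons."""

    def __init__(self, n_channels, in_dim, out_dim, tau=2.0, v_th=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, size=(in_dim, out_dim))  # learnable weights
        self.tau = tau    # membrane leak time constant (assumed value)
        self.v_th = v_th  # firing threshold (assumed value)
        self.v = np.zeros((n_channels, out_dim))  # membrane potentials

    def step(self, A_hat, X_t):
        """One simulation time step.
        A_hat: (C, C) normalized adjacency; X_t: (C, in_dim) channel features."""
        # Graph convolution: aggregate neighbouring channels, then project.
        current = A_hat @ X_t @ self.W
        # LIF dynamics: leak toward input, fire on threshold crossing, hard reset.
        self.v = self.v + (current - self.v) / self.tau
        spikes = (self.v >= self.v_th).astype(float)
        self.v = self.v * (1.0 - spikes)
        return spikes

if __name__ == "__main__":
    C, F, T = 16, 8, 10  # channels, features per channel, time steps (assumed)
    rng = np.random.default_rng(1)
    A = (rng.random((C, C)) > 0.7).astype(float)  # toy channel connectivity
    A = np.maximum(A, A.T)                        # make it symmetric
    A_hat = normalized_adjacency(A)
    layer = SpikingGraphConvLayer(C, F, out_dim=4)
    for t in range(T):
        X_t = rng.normal(size=(C, F))             # stand-in for EEG-derived features
        out = layer.step(A_hat, X_t)
    print("spike output shape:", out.shape)       # (16, 4)
```

Whether the published SGCN uses this exact normalization, a learned adjacency, or different neuron dynamics is not stated in this record; the sketch only illustrates how graph aggregation across channels can feed spiking units.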
Appears in Collections: Journal/Magazine Article
Open Access Information
Status: embargoed access
Embargo End Date: 0000-00-00 (to be updated)