Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/101341
DC Field	Value	Language
dc.contributor	Department of Computing	en_US
dc.creator	Zhou, Z	en_US
dc.creator	Shi, J	en_US
dc.creator	Yang, R	en_US
dc.creator	Zou, Y	en_US
dc.creator	Li, Qing	en_US
dc.date.accessioned	2023-09-05T08:44:55Z	-
dc.date.available	2023-09-05T08:44:55Z	-
dc.identifier.issn	2640-3498	en_US
dc.identifier.uri	http://hdl.handle.net/10397/101341	-
dc.description	40th International Conference on Machine Learning, 23-29 July 2023, Honolulu, Hawaii, USA	en_US
dc.language.iso	en	en_US
dc.publisher	PMLR web site	en_US
dc.rights	Copyright 2023 by the author(s).	en_US
dc.rights	Posted with permission of the author.	en_US
dc.rights	The following publication Zhou, Z., Shi, J., Yang, R., Zou, Y., & Li, Q. (2023). SlotGAT: Slot-based Message Passing for Heterogeneous Graphs. Proceedings of Machine Learning Research, 202, 42644-42657 is available at https://proceedings.mlr.press/v202/zhou23j.html.	en_US
dc.title	SlotGAT: slot-based message passing for heterogeneous graphs	en_US
dc.type	Conference Paper	en_US
dc.identifier.spage	42644	en_US
dc.identifier.epage	42657	en_US
dc.identifier.volume	202	en_US
dcterms.abstract	Heterogeneous graphs are ubiquitous for modeling complex data, and there is an urgent need for powerful heterogeneous graph neural networks to effectively support important applications. We identify a potential semantic mixing issue in existing message passing processes: the representations of the neighbors of a node v are forced into the feature space of v for aggregation, even though the neighbors are of different types. That is, the semantics of different node types become entangled in node v's representation. To address this issue, we propose SlotGAT, which performs separate message passing in slots, one per node type, to keep representations in their own node-type feature spaces. Moreover, within a slot-based message passing layer, we design an attention mechanism for effective slot-wise message aggregation. Further, we develop a slot attention technique after the last layer of SlotGAT to learn the importance of different slots for downstream tasks. Our analysis indicates that the slots in SlotGAT preserve different semantics in their respective feature spaces. The superiority of SlotGAT is evaluated against 13 baselines on 6 datasets for node classification and link prediction. Our code is at https://github.com/scottjiao/SlotGAT_ICML23/.	en_US
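The slot-based message passing described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (see their repository for that); the function name, the per-node adjacency-list input, and the dot-product attention form are all assumptions made for illustration. The key idea it demonstrates is that each node keeps one representation slot per node type, and a neighbor's message is aggregated only into the slot matching that neighbor's type, so semantics of different types stay separated.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

def slot_message_passing(feats, types, adj, num_types):
    """One illustrative slot-based message-passing step (hypothetical sketch).

    feats: (N, d) array of node features.
    types: length-N list of node-type ids.
    adj:   adjacency list {node: [neighbor, ...]}.

    Returns (N, num_types, d) slot representations: slot t of node v
    aggregates only messages from type-t neighbors, instead of mixing
    all neighbor types into a single vector.
    """
    N, d = feats.shape
    slots = np.zeros((N, num_types, d))
    for v in range(N):
        # a node's own feature occupies the slot of its own type
        slots[v, types[v]] = feats[v]
        neigh = adj.get(v, [])
        if not neigh:
            continue
        # attention over neighbors (dot-product scoring is an assumption;
        # the paper uses a learned attention mechanism)
        scores = softmax(np.array([feats[v] @ feats[u] for u in neigh]))
        for w, u in zip(scores, neigh):
            # the message from u lands only in the slot of u's type
            slots[v, types[u]] += w * feats[u]
    return slots
```

In a full model, the per-type slots would then be combined by a learned slot attention for the downstream task; here they are simply returned so the type separation is visible.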
dcterms.accessRights	open access	en_US
dcterms.bibliographicCitation	Proceedings of Machine Learning Research, 2023, v. 202, p. 42644-42657	en_US
dcterms.isPartOf	Proceedings of Machine Learning Research	en_US
dcterms.issued	2023	-
dc.relation.conference	International Conference on Machine Learning [ICML]	en_US
dc.publisher.place	Honolulu, Hawaii, USA	en_US
dc.description.validate	202309 bckw	en_US
dc.description.oa	Version of Record	en_US
dc.identifier.FolderNumber	a2248	-
dc.identifier.SubFormID	47216	-
dc.description.fundingSource	RGC	en_US
dc.description.fundingSource	Others	en_US
dc.description.fundingText	National Natural Science Foundation of China; Tencent Technology (Shenzhen) Co., Ltd; Hong Kong Polytechnic University	en_US
dc.description.pubStatus	Published	en_US
dc.description.oaCategory	Copyright retained by author	en_US
Appears in Collections:Conference Paper
Files in This Item:
File	Description	Size	Format
zhou23j.pdf		1.62 MB	Adobe PDF
Open Access Information
Status	open access
File Version	Version of Record

Page views: 171 (as of May 11, 2025)
Downloads: 50 (as of May 11, 2025)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.