Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/102025
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Dong, J | en_US
dc.creator | Zhang, Q | en_US
dc.creator | Huang, X | en_US
dc.creator | Duan, K | en_US
dc.creator | Tan, Q | en_US
dc.creator | Jiang, Z | en_US
dc.date.accessioned | 2023-10-09T06:48:47Z | -
dc.date.available | 2023-10-09T06:48:47Z | -
dc.identifier.citation | p. 2519-2527 | -
dc.identifier.isbn | 978-1-4503-9416-1 | en_US
dc.identifier.other | p. 2519-2527 | -
dc.identifier.uri | http://hdl.handle.net/10397/102025 | -
dc.description | WWW '23: The ACM Web Conference 2023, Austin, TX, USA, 30 April 2023-4 May 2023 | en_US
dc.language.iso | en | en_US
dc.publisher | Association for Computing Machinery | en_US
dc.rights | © Copyright held by the owner/author(s). Publication rights licensed to ACM. 2023. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in http://dx.doi.org/10.1145/3543507.3583376. | en_US
dc.subject | Graph neural networks | en_US
dc.subject | Knowledge graphs | en_US
dc.subject | Multi-hop question answering | en_US
dc.title | Hierarchy-aware multi-hop question answering over knowledge graphs | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 2519 | en_US
dc.identifier.epage | 2527 | en_US
dc.identifier.doi | 10.1145/3543507.3583376 | en_US
dcterms.abstract | Knowledge graphs (KGs) have been widely used to enhance complex question answering (QA). To understand complex questions, existing studies employ language models (LMs) to encode contexts. Despite their simplicity, these approaches neglect the latent relational information between question concepts and answers in KGs. Question concepts ubiquitously exhibit hyponymy at the semantic level, e.g., mammals and animals, and this feature is mirrored by the hierarchical relations in KGs, e.g., a_type_of. We are therefore motivated to exploit the hierarchical structures in KGs for comprehensive reasoning that helps understand questions. However, reasoning over tree-like structures is non-trivial compared with reasoning over chained paths, and identifying appropriate hierarchies relies on expertise. To this end, we propose HamQA, a novel Hierarchy-aware multi-hop Question Answering framework over knowledge graphs that effectively aligns the mutual hierarchical information between question contexts and KGs. The entire learning is conducted in hyperbolic space, inspired by its advantages for embedding hierarchical structures. Specifically, (i) we design a context-aware graph attentive network to capture context information, and (ii) hierarchical structures in KGs are continuously preserved by minimizing hyperbolic geodesic distances. Comprehensive reasoning is conducted by jointly training both components, and the top-ranked candidate is returned as the answer. We achieve a higher ranking than the state-of-the-art multi-hop baselines on the official OpenBookQA leaderboard, with an accuracy of 85%. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | The ACM Web Conference 2023 : proceedings of the World Wide Web Conference WWW 2023, p. 2519-2527 | en_US
dcterms.issued | 2023-04 | -
dc.relation.conference | World Wide Web Conference [WWW] | en_US
dc.description.validate | 202310 bcch | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a2464, a3041 | -
dc.identifier.SubFormID | 47740, 49260 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
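
Note: the abstract above describes preserving KG hierarchies by minimizing hyperbolic geodesic distances. The record does not include the paper's actual model, but as a minimal illustrative sketch of that general technique, the snippet below computes the geodesic distance in the Poincaré ball, a common choice for hyperbolic embeddings; the function name and the example concept pair are hypothetical, not taken from the paper.

    import math

    def poincare_distance(x, y, eps=1e-9):
        # Geodesic distance in the Poincare ball model (points must lie
        # strictly inside the unit ball):
        # d(x, y) = arcosh(1 + 2*||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))
        sq_norm = lambda v: sum(c * c for c in v)
        diff = sq_norm([a - b for a, b in zip(x, y)])
        denom = max((1.0 - sq_norm(x)) * (1.0 - sq_norm(y)), eps)
        return math.acosh(1.0 + 2.0 * diff / denom)

    # Hypothetical hypernym pair: a hierarchy-preserving loss of the kind the
    # abstract mentions would sum this distance over (child, parent) pairs.
    mammal = [0.30, 0.40]   # more specific concepts sit nearer the boundary
    animal = [0.10, 0.15]   # generic concepts sit nearer the origin
    print(poincare_distance(mammal, animal))

Hyperbolic space suits this setting because its volume grows exponentially with radius, so tree-like hierarchies embed with low distortion; minimizing the geodesic distance between hypernym pairs keeps each concept close to its ancestors along the hierarchy.
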
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
Dong_Hierarchy-Aware_Multi-Hop_Question.pdf | Pre-Published version | 2.02 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 126 (as of Apr 14, 2025)
Downloads: 300 (as of Apr 14, 2025)
SCOPUS™ citations: 6 (as of Jun 21, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.