Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105570
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Wang, Z | en_US
dc.creator | Li, W | en_US
dc.date.accessioned | 2024-04-15T07:35:06Z | -
dc.date.available | 2024-04-15T07:35:06Z | -
dc.identifier.isbn | 978-0-9992411-4-1 (Online) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/105570 | -
dc.language.iso | en | en_US
dc.publisher | International Joint Conferences on Artificial Intelligence | en_US
dc.rights | Copyright © 2019 International Joint Conferences on Artificial Intelligence | en_US
dc.rights | All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher. | en_US
dc.rights | Posted with permission of the IJCAI Organization (https://www.ijcai.org/). | en_US
dc.rights | The following publication Wang, Z., & Li, W. (2019, August). Hierarchical Diffusion Attention Network. In Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, Macao, 10-16 August 2019, pp. 3828-3834. IJCAI, 2019, is available at https://doi.org/10.24963/ijcai.2019/531. | en_US
dc.title | Hierarchical diffusion attention network | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 3828 | en_US
dc.identifier.epage | 3834 | en_US
dc.identifier.doi | 10.24963/ijcai.2019/531 | en_US
dcterms.abstract | A series of recent studies formulated the diffusion prediction problem as a sequence prediction task and proposed several sequential models based on recurrent neural networks. However, non-sequential properties exist in real diffusion cascades, which do not strictly follow the sequential assumptions of previous work. In this paper, we propose a hierarchical diffusion attention network (HiDAN), which adopts a non-sequential framework and two-level attention mechanisms, for diffusion prediction. At the user level, a dependency attention mechanism is proposed to dynamically capture historical user-to-user dependencies and extract the dependency-aware user information. At the cascade (i.e., sequence) level, a time-aware influence attention is designed to infer a possible future user's dependencies on historical users by considering both inherent user importance and time-decay effects. Evaluations on three real diffusion datasets demonstrate that HiDAN is significantly more effective and efficient than state-of-the-art sequential models. Further case studies illustrate that HiDAN can accurately capture diffusion dependencies. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, Macao, 10-16 August 2019, p. 3828-3834 | en_US
dcterms.issued | 2019 | -
dc.identifier.scopus | 2-s2.0-85074950142 | -
dc.relation.conference | International Joint Conference on Artificial Intelligence [IJCAI] | en_US
dc.description.validate | 202402 bcc | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | COMP-0535 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China; Hong Kong Polytechnic University | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 20004077 | -
dc.description.oaCategory | Publisher permission | en_US
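The abstract's cascade-level "time-aware influence attention" (scoring historical users by inherent importance discounted by elapsed time, then aggregating) can be sketched roughly as follows. This is an illustrative assumption based only on the abstract, not the paper's exact formulation; the function name `time_aware_attention`, the importance vector `importance_w`, and the decay rate `lam` are hypothetical names introduced here.

```python
import numpy as np

def time_aware_attention(user_states, importance_w, elapsed, lam=0.1):
    """Illustrative sketch (not the authors' exact model): score each
    historical user by an inherent-importance term minus a time-decay
    penalty, then attention-pool the user states."""
    # Inherent importance of each historical user (a learned projection
    # in a real model; here just a dot product with importance_w).
    scores = user_states @ importance_w
    # Time decay: the longer ago a user joined the cascade, the less
    # influence it exerts on the next (possible future) user.
    scores = scores - lam * elapsed
    # Softmax over historical users (shifted for numerical stability).
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Cascade representation: attention-weighted sum of user states.
    return weights, weights @ user_states

# Toy usage: 5 historical users with 8-dim states, oldest first.
rng = np.random.default_rng(0)
states = rng.normal(size=(5, 8))
w = rng.normal(size=8)
weights, context = time_aware_attention(
    states, w, elapsed=np.array([4.0, 3.0, 2.0, 1.0, 0.0]))
```

With a larger `lam`, the weights concentrate on the most recent users; with `lam=0` the scores reduce to pure importance attention.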
Appears in Collections: Conference Paper
Files in This Item:
File | Size | Format
0531.pdf | 605.88 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 52 (as of Apr 14, 2025)
Downloads: 21 (as of Apr 14, 2025)
SCOPUS™ citations: 26 (as of Jun 12, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.