Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/104395
DC Field | Value | Language
dc.contributor | Department of Industrial and Systems Engineering | en_US
dc.creator | Fan, W | en_US
dc.creator | Wang, HM | en_US
dc.creator | Xing, Y | en_US
dc.creator | Huang, R | en_US
dc.creator | Ip, WH | en_US
dc.creator | Yung, KL | en_US
dc.date.accessioned | 2024-02-05T08:49:28Z | -
dc.date.available | 2024-02-05T08:49:28Z | -
dc.identifier.issn | 1432-7643 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/104395 | -
dc.language.iso | en | en_US
dc.publisher | Springer | en_US
dc.rights | © Springer-Verlag GmbH Germany, part of Springer Nature 2019 | en_US
dc.rights | This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms), but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s00500-019-04451-z. | en_US
dc.subject | Edge network | en_US
dc.subject | Edge representation vectors | en_US
dc.subject | Network representation learning | en_US
dc.subject | Node representation vectors | en_US
dc.title | A network representation method based on edge information extraction | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 8223 | en_US
dc.identifier.epage | 8231 | en_US
dc.identifier.volume | 24 | en_US
dc.identifier.issue | 11 | en_US
dc.identifier.doi | 10.1007/s00500-019-04451-z | en_US
dcterms.abstract | In recent years, network representation learning has attracted extensive attention in the academic community due to its significant application potential. However, most existing methods do not explore edge information in the network deeply, resulting in poor performance on downstream tasks such as classification, clustering and link prediction. To address this problem, we propose a novel way to extract network information. First, the original network is transformed into an edge network that carries both structure and edge information. Then, edge representation vectors can be obtained directly by feeding the edge network into an existing network representation model. Node representation vectors can in turn be obtained by exploiting the relationships between edges and nodes. Compared with the original network, the edge network is denser, which helps alleviate the problems caused by sparsity. Extensive experiments on several real-world networks demonstrate that the edge network outperforms the original network in various graph mining tasks, e.g., node classification and node clustering. (An illustrative code sketch of this pipeline follows the metadata record below.) | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Soft computing, June 2020, v. 24, no. 11, p. 8223-8231 | en_US
dcterms.isPartOf | Soft computing | en_US
dcterms.issued | 2020-06 | -
dc.identifier.scopus | 2-s2.0-85075381546 | -
dc.identifier.eissn | 1433-7479 | en_US
dc.description.validate | 202402 bcch | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | ISE-0303 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China; Fundamental Research Funds for the Central Universities of Civil Aviation University of China; Scientific Research Foundation of Civil Aviation University of China; The Hong Kong Polytechnic University | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 56392061 | -
dc.description.oaCategory | Green (AAM) | en_US
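The abstract above describes a three-step pipeline: build an edge network from the original network, embed the edge network with an existing representation model, then derive node vectors from the edge-node relationships. The sketch below restates that pipeline in Python under stated assumptions: networkx's line graph stands in for the edge-network construction, a generic embed_fn argument stands in for "an existing network representation model", and mean-pooling of incident-edge vectors stands in for the edge-to-node step. These concrete choices are not confirmed by this record; the code is illustrative only, not the authors' implementation.

```python
# Minimal sketch of the edge-network pipeline described in the abstract.
# Assumptions (not from the paper): line graph as the edge network, a
# caller-supplied embedding model, mean-pooling for the edge-to-node step.
import networkx as nx
import numpy as np

def edge_network_embeddings(G, embed_fn, dim=64):
    """Return (edge_vectors, node_vectors) for graph G.

    embed_fn: any network representation model, called as embed_fn(graph, dim)
              and returning {node: np.ndarray of shape (dim,)}.
    """
    # 1. Transform the original network into an edge network: each edge of G
    #    becomes a vertex, and two vertices are adjacent when the corresponding
    #    edges share an endpoint in G.
    edge_net = nx.line_graph(G)

    # 2. Run an existing representation model on the (denser) edge network;
    #    its "nodes" are the edges of G, so these are edge representation vectors.
    edge_vectors = embed_fn(edge_net, dim)

    # 3. Recover node representation vectors from edge-node incidence; here by
    #    averaging the vectors of all edges incident to the node (an assumed rule).
    node_vectors = {}
    for n in G.nodes():
        incident = [vec for edge, vec in edge_vectors.items() if n in edge]
        node_vectors[n] = np.mean(incident, axis=0) if incident else np.zeros(dim)
    return edge_vectors, node_vectors

# Example stand-in embedding model: a plain adjacency-eigenvector embedding.
def spectral_embed(graph, dim):
    nodes = list(graph.nodes())
    A = nx.to_numpy_array(graph, nodelist=nodes)
    _, vecs = np.linalg.eigh(A)
    emb = vecs[:, -dim:]  # eigenvectors of the dim largest eigenvalues
    return {node: emb[i] for i, node in enumerate(nodes)}

edge_vecs, node_vecs = edge_network_embeddings(nx.karate_club_graph(),
                                               spectral_embed, dim=8)
```

In practice the embed_fn slot would be filled by whichever node-embedding model is already in use (for example, a random-walk-based model); the averaging step is only one of several plausible ways to map edge vectors back to nodes.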
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Ip_Network_Representation_Method.pdf | Pre-Published version | 1.21 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Access: View full-text via PolyU eLinks SFX Query

Page views: 65 (as of May 11, 2025)
Downloads: 22 (as of May 11, 2025)
SCOPUS citations: 2 (as of May 15, 2025)
Web of Science citations: 2 (as of May 15, 2025)