Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/97231
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Suzuki, A | en_US
dc.creator | Nitanda, A | en_US
dc.creator | Wang, J | en_US
dc.creator | Xu, L | en_US
dc.creator | Yamanishi, K | en_US
dc.creator | Cavazza, M | en_US
dc.date.accessioned | 2023-02-21T03:25:47Z | -
dc.date.available | 2023-02-21T03:25:47Z | -
dc.identifier.isbn | 9781713845393 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/97231 | -
dc.description | 35th Conference on Neural Information Processing Systems (NeurIPS 2021), Virtual-only Conference, 6-14 Dec 2021 | en_US
dc.language.iso | en | en_US
dc.publisher | NeurIPS | en_US
dc.rights | Posted with permission of the author. | en_US
dc.title | Generalization bounds for graph embedding using negative sampling : linear vs hyperbolic | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 1243 | en_US
dc.identifier.epage | 1255 | en_US
dcterms.abstract | Graph embedding, which represents real-world entities in a mathematical space, has enabled numerous applications such as analyzing natural languages, social networks, biochemical networks, and knowledge bases. It has been experimentally shown that graph embedding in hyperbolic space can represent hierarchical tree-like data more effectively than embedding in linear space, owing to hyperbolic space's exponential growth property. However, since the theoretical comparison has been limited to ideal noiseless settings, the potential for the hyperbolic space's property to worsen the generalization error for practical data has not been analyzed. In this paper, we provide a generalization error bound applicable to graph embedding in both linear and hyperbolic spaces under the various negative sampling settings that appear in graph embedding. Our bound states that the error is polynomial and exponential with respect to the embedding space's radius in linear and hyperbolic spaces, respectively, which implies that hyperbolic space's exponential growth property worsens the error. Using our bound, we clarify the data-size condition under which graph embedding in hyperbolic space can represent a tree better than in Euclidean space by discussing the bias-variance trade-off. Our bound also shows that an imbalanced data distribution, which often appears in graph embedding, can worsen the error. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In M. Ranzato, A. Beygelzimer, Y. Dauphin, P.S. Liang, & J. Wortman Vaughan (Eds.), Advances in Neural Information Processing Systems 34 (NeurIPS 2021), p. 1243-1255. NeurIPS, 2021 | en_US
dcterms.issued | 2021 | -
dc.relation.ispartofbook | Advances in Neural Information Processing Systems 34 (NeurIPS 2021) | en_US
dc.relation.conference | Conference on Neural Information Processing Systems [NeurIPS] | en_US
dc.description.validate | 202302 bcch | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a1532 | -
dc.identifier.SubFormID | 45353 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | JST, JST-AIP, Daiwa Foundation Award, JST-PRESTO | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Copyright retained by author | en_US
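The abstract's contrast between polynomial and exponential dependence on the embedding radius comes from hyperbolic space's exponential growth property. As a minimal sketch (illustrative only, not from the paper; the function names and the `if __name__` demo are our own), the snippet below computes the standard Poincaré-ball distance commonly used in hyperbolic graph embedding and compares Euclidean and hyperbolic circle circumferences, whose ratio grows roughly like e^r / r:

```python
import math

# Illustrative sketch only: helper names here are our own, not the paper's.

def poincare_distance(u, v):
    """Distance between two points strictly inside the unit ball, in the
    Poincare ball model of hyperbolic space:
        d(u, v) = arcosh(1 + 2*|u - v|^2 / ((1 - |u|^2) * (1 - |v|^2)))
    """
    sq_norm = lambda x: sum(xi * xi for xi in x)
    diff = [ui - vi for ui, vi in zip(u, v)]
    arg = 1.0 + 2.0 * sq_norm(diff) / ((1.0 - sq_norm(u)) * (1.0 - sq_norm(v)))
    return math.acosh(arg)

def euclidean_circumference(r):
    """Circumference of a radius-r circle in the Euclidean plane: 2*pi*r,
    linear in r."""
    return 2.0 * math.pi * r

def hyperbolic_circumference(r):
    """Circumference of a radius-r circle in the hyperbolic plane:
    2*pi*sinh(r), which grows like e^r -- the exponential growth property."""
    return 2.0 * math.pi * math.sinh(r)

if __name__ == "__main__":
    # At radius 5 the hyperbolic circle is already ~15x longer than the
    # Euclidean one, showing how fast the available "room" expands.
    print(hyperbolic_circumference(5.0) / euclidean_circumference(5.0))
```

This expanding room is what lets hyperbolic embeddings fit tree-like hierarchies at low distortion, while, per the abstract, the same growth makes the generalization error bound exponential rather than polynomial in the embedding radius.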
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
generalization_bounds_for_grap.pdf | Pre-Published version | 380.67 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Access: View full-text via PolyU eLinks

Page views: 110 (as of Oct 6, 2025)
Downloads: 38 (as of Oct 6, 2025)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.