Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107877
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Wang, R | en_US
dc.creator | Li, J | en_US
dc.creator | Li, P | en_US
dc.date.accessioned | 2024-07-15T07:55:29Z | -
dc.date.available | 2024-07-15T07:55:29Z | -
dc.identifier.isbn | 979-8-89176-061-5 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/107877 | -
dc.description | The 2023 Conference on Empirical Methods in Natural Language Processing, December 6-10, 2023, Singapore | en_US
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics (ACL) | en_US
dc.rights | © 2023 Association for Computational Linguistics | en_US
dc.rights | Materials published in or after 2016 are licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication, Renzhi Wang, Jing Li, and Piji Li. 2023. InfoDiffusion: Information Entropy Aware Diffusion Process for Non-Autoregressive Text Generation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13757–13770, Singapore. Association for Computational Linguistics, is available at https://aclanthology.org/2023.findings-emnlp.919/. | en_US
dc.title | InfoDiffusion : information entropy aware diffusion process for non-autoregressive text generation | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 13757 | en_US
dc.identifier.epage | 13770 | en_US
dcterms.abstract | Diffusion models have garnered considerable interest in the field of text generation. Several studies have explored text diffusion models with different structures and applied them to various tasks, including named entity recognition and summarization. However, there exists a notable disparity between the “easy-first” text generation process of current diffusion models and the “keyword-first” natural text generation process of humans, which has received limited attention. To bridge this gap, we propose InfoDiffusion, a non-autoregressive text diffusion model. Our approach introduces a “keyinfo-first” generation strategy and incorporates a noise schedule based on the amount of text information. In addition, InfoDiffusion combines self-conditioning with a newly proposed partially noising model structure. Experimental results show that InfoDiffusion outperforms the baseline model in terms of generation quality and diversity, as well as exhibiting higher sampling efficiency. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Findings of the Association for Computational Linguistics: EMNLP 2023, pp. 13757–13770, Singapore. Association for Computational Linguistics, 2023 | en_US
dcterms.issued | 2023 | -
dc.relation.conference | Conference on Empirical Methods in Natural Language Processing [EMNLP] | en_US
dc.description.validate | 202407 bcwh | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | a3033 | -
dc.identifier.SubFormID | 49241 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
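
The abstract's "keyinfo-first" strategy pairs the noise schedule with a per-token information measure, so that low-information tokens are corrupted earlier in the forward diffusion process while high-information key tokens stay clean longest and are therefore recovered first when the process is reversed. The sketch below is a minimal illustration of that idea only, not the paper's actual schedule: the unigram self-information estimate, the function names (token_information, noising_steps), and the linear rank-to-step assignment are all assumptions introduced here.

import math

def token_information(tokens, unigram_counts, total_count):
    # Self-information -log p(w) under a unigram model (a hypothetical
    # stand-in for the paper's information-entropy measure): rare tokens
    # carry more information than frequent function words.
    return [-math.log(unigram_counts.get(t, 1) / total_count) for t in tokens]

def noising_steps(tokens, unigram_counts, total_count, num_steps):
    # Assign each position the forward step at which it is first noised.
    # Low-information tokens are noised earliest, so the reverse
    # (denoising) process reconstructs high-information tokens first.
    info = token_information(tokens, unigram_counts, total_count)
    order = sorted(range(len(tokens)), key=lambda i: info[i])
    steps = [0] * len(tokens)
    for rank, idx in enumerate(order):
        steps[idx] = rank * num_steps // len(tokens)
    return steps

# Toy usage: "the" is frequent (low information), "InfoDiffusion" is rare.
counts = {"the": 1000, "model": 50, "InfoDiffusion": 1}
total = sum(counts.values())
print(noising_steps(["the", "model", "InfoDiffusion"], counts, total, num_steps=2000))
# -> [0, 666, 1333]: the rare key token is noised last, denoised first.

The self-conditioning component mentioned in the abstract (feeding the model's previous estimate of the clean sequence back in as an extra input at the next reverse step) is orthogonal to this schedule and is not shown here.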
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
2023.findings-emnlp.919.pdf | - | 450.29 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record