Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107739
DC FieldValueLanguage
dc.contributorDepartment of Computingen_US
dc.creatorSaxena, Den_US
dc.creatorCao, Jen_US
dc.creatorXu, Jen_US
dc.creatorKulshrestha, Ten_US
dc.date.accessioned2024-07-10T06:20:53Z-
dc.date.available2024-07-10T06:20:53Z-
dc.identifier.isbn1-57735-887-2en_US
dc.identifier.isbn978-1-57735-887-9en_US
dc.identifier.urihttp://hdl.handle.net/10397/107739-
dc.descriptionThe Thirty-Eighth AAAI Conference on Artificial Intelligence, February 20–27, 2024, Vancouver, Canadaen_US
dc.language.isoenen_US
dc.publisherAAAI Pressen_US
dc.rightsCopyright © 2024, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.en_US
dc.rightsPosted with permission of the publisher.en_US
dc.rightsThis is the author's manuscript of the following paper: Saxena, D., Cao, J., Xu, J., & Kulshrestha, T. (2024). RG-GAN: Dynamic Regenerative Pruning for Data-Efficient Generative Adversarial Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(5), 4704-4712, which is available at https://doi.org/10.1609/aaai.v38i5.28271.en_US
dc.titleRG-GAN: dynamic regenerative pruning for data-efficient generative adversarial networksen_US
dc.typeConference Paperen_US
dc.identifier.spage4704en_US
dc.identifier.epage4712en_US
dc.identifier.doi10.1609/aaai.v38i5.28271en_US
dcterms.abstractTraining Generative Adversarial Networks (GANs) to generate high-quality images typically requires large datasets. Network pruning during training has recently emerged as a significant advancement for data-efficient GANs. However, simple and straightforward pruning risks losing key information, producing suboptimal results due to the competitive dynamics between the generator (G) and discriminator (D). Addressing this, we present RG-GAN, a novel approach that marks the first incorporation of dynamic weight regeneration and pruning in GAN training to improve the quality of the generated samples, even with limited data. Specifically, RG-GAN initiates layer-wise dynamic pruning by removing weights that are less important to the quality of the generated images. While pruning enhances efficiency, excessive sparsity within layers can pose a risk of model collapse. To mitigate this issue, RG-GAN applies a dynamic regeneration method that reintroduces specific weights when they become important, ensuring a balance between sparsity and image quality. Though effective, the sparse network achieved through this process might still eliminate some weights important to the combined performance of G and D, a crucial aspect of stable and effective GAN training. RG-GAN addresses this loss of weights by integrating the learned sparse-network weights back into the dense network of the previous stage during a follow-up regeneration step. Our results consistently demonstrate RG-GAN’s robust performance across a variety of scenarios, including different GAN architectures, datasets, and degrees of data scarcity, reinforcing its value as a generic training methodology. Results also show that data augmentation performs even better in conjunction with RG-GAN. Furthermore, RG-GAN can reduce the number of parameters without compromising, and even while enhancing, the quality of the generated samples. Code can be found at this link: https://github.com/IntellicentAI-Lab/RG-GANen_US
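The prune-then-regenerate loop described in the abstract can be illustrated with a minimal, hypothetical sketch of one layer-wise step: magnitude-based pruning of the least important active weights, followed by gradient-based regrowth of pruned positions that have become important. The function name, the `prune_frac`/`regen_frac` parameters, and the gradient-magnitude criterion are illustrative assumptions, not the paper's actual schedule or criteria.

```python
import numpy as np

def prune_and_regenerate(weights, mask, grads, prune_frac=0.2, regen_frac=0.1):
    """One hypothetical prune/regenerate step on a flattened layer.

    Pruning: deactivate the smallest-magnitude fraction of active weights.
    Regeneration: reactivate inactive positions whose gradient magnitude
    is largest, standing in for 'weights that become important'.
    """
    active = np.flatnonzero(mask)

    # Prune the least important (smallest-magnitude) active weights.
    n_prune = int(prune_frac * active.size)
    if n_prune > 0:
        drop = active[np.argsort(np.abs(weights[active]))[:n_prune]]
        mask[drop] = False

    # Regenerate inactive positions with the largest gradient magnitude.
    inactive = np.flatnonzero(~mask)
    n_regen = int(regen_frac * inactive.size)
    if n_regen > 0:
        grow = inactive[np.argsort(np.abs(grads[inactive]))[-n_regen:]]
        mask[grow] = True
        weights[grow] = 0.0  # regrown weights restart from zero

    return weights * mask, mask
```

In a full training loop this step would be applied per layer at scheduled intervals, with the surviving sparse weights later merged back into the dense network of the previous stage, as the abstract describes.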
dcterms.accessRightsopen accessen_US
dcterms.bibliographicCitationProceedings of the 38th AAAI Conference on Artificial Intelligence, p. 4704-4712en_US
dcterms.issued2024-
dc.identifier.scopus2-s2.0-85189495316-
dc.relation.conferenceConference on Artificial Intelligence [AAAI]en_US
dc.description.validate202407 bcwhen_US
dc.description.oaNot applicableen_US
dc.identifier.FolderNumbera2979-
dc.identifier.SubFormID49002-
dc.description.fundingSourceRGCen_US
dc.description.pubStatusPublisheden_US
dc.description.oaCategoryPublisher permissionen_US
Appears in Collections:Conference Paper
Files in This Item:
File: RG_GAN.pdf (1.41 MB, Adobe PDF)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.