Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/107739
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Computing | en_US |
dc.creator | Saxena, D | en_US |
dc.creator | Cao, J | en_US |
dc.creator | Xu, J | en_US |
dc.creator | Kulshrestha, T | en_US |
dc.date.accessioned | 2024-07-10T06:20:53Z | - |
dc.date.available | 2024-07-10T06:20:53Z | - |
dc.identifier.isbn | 1-57735-887-2 | en_US |
dc.identifier.isbn | 978-1-57735-887-9 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/107739 | - |
dc.description | The Thirty-Eighth AAAI Conference on Artificial Intelligence, February 20–27, 2024, Vancouver, Canada | en_US |
dc.language.iso | en | en_US |
dc.publisher | AAAI Press | en_US |
dc.rights | Copyright © 2024, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved. | en_US |
dc.rights | Posted with permission of the publisher. | en_US |
dc.rights | This is the author's manuscript of the following paper: Saxena, D., Cao, J., Xu, J., & Kulshrestha, T. (2024). RG-GAN: Dynamic Regenerative Pruning for Data-Efficient Generative Adversarial Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(5), 4704-4712, which is available at https://doi.org/10.1609/aaai.v38i5.28271. | en_US |
dc.title | RG-GAN: dynamic regenerative pruning for data-efficient generative adversarial networks | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 4704 | en_US |
dc.identifier.epage | 4712 | en_US |
dc.identifier.doi | 10.1609/aaai.v38i5.28271 | en_US |
dcterms.abstract | Training Generative Adversarial Networks (GAN) to generate high-quality images typically requires large datasets. Network pruning during training has recently emerged as a significant advancement for data-efficient GAN. However, simple and straightforward pruning risks losing key information, leading to suboptimal outcomes due to GAN’s competitive dynamics between generator (G) and discriminator (D). Addressing this, we present RG-GAN, a novel approach that marks the first incorporation of dynamic weight regeneration and pruning in GAN training to improve the quality of the generated samples, even with limited data. Specifically, RG-GAN initiates layer-wise dynamic pruning by removing weights that are less important to the quality of the generated images. While pruning enhances efficiency, excessive sparsity within layers can pose a risk of model collapse. To mitigate this issue, RG-GAN applies a dynamic regeneration method to reintroduce specific weights when they become important, ensuring a balance between sparsity and image quality. Though effective, the sparse network achieved through this process might eliminate some weights important to the combined G and D performance, a crucial aspect for achieving stable and effective GAN training. RG-GAN addresses this loss of weights by integrating learned sparse network weights back into the dense network at the previous stage during a follow-up regeneration step. Our results consistently demonstrate RG-GAN’s robust performance across a variety of scenarios, including different GAN architectures, datasets, and degrees of data scarcity, reinforcing its value as a generic training methodology. Results also show that data augmentation performs better in conjunction with RG-GAN. Furthermore, RG-GAN can achieve fewer parameters without compromising, and even enhancing, the quality of the generated samples. Code can be found at this link: https://github.com/IntellicentAI-Lab/RG-GAN | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Proceedings of the 38th AAAI Conference on Artificial Intelligence, p. 4704-4712 | en_US |
dcterms.issued | 2024 | - |
dc.identifier.scopus | 2-s2.0-85189495316 | - |
dc.relation.conference | Conference on Artificial Intelligence [AAAI] | en_US |
dc.description.validate | 202407 bcwh | en_US |
dc.description.oa | Not applicable | en_US |
dc.identifier.FolderNumber | a2979 | - |
dc.identifier.SubFormID | 49002 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | Publisher permission | en_US |
Appears in Collections: | Conference Paper |
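The abstract above outlines a prune-and-regenerate training cycle: layer-wise removal of weights that matter least to image quality, followed by re-activation of pruned weights when they appear to become important again. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea (magnitude-based pruning plus gradient-based regeneration on a toy generator layer). It is not the authors' implementation, which is available at the GitHub link in the abstract; the names `apply_dynamic_pruning`, `regenerate_weights`, `prune_ratio`, and `regen_ratio`, as well as the fixed schedule, are assumptions made for this example.

```python
# Minimal sketch only: magnitude-based pruning with gradient-based regeneration
# on a toy generator layer. Not the RG-GAN implementation; names, criteria, and
# schedule are hypothetical.

import torch
import torch.nn as nn


def apply_dynamic_pruning(layer: nn.Linear, mask: torch.Tensor, prune_ratio: float) -> torch.Tensor:
    """Zero out the smallest-magnitude fraction of the currently active weights."""
    with torch.no_grad():
        active = layer.weight[mask.bool()]
        k = int(prune_ratio * active.numel())
        if k > 0:
            threshold = active.abs().kthvalue(k).values
            mask = mask * (layer.weight.abs() > threshold).float()
            layer.weight.mul_(mask)  # enforce the new sparsity pattern in place
    return mask


def regenerate_weights(layer: nn.Linear, mask: torch.Tensor, regen_ratio: float) -> torch.Tensor:
    """Re-activate pruned positions with the largest gradient magnitude, i.e.
    connections that appear to have become important again."""
    if layer.weight.grad is None:
        return mask
    with torch.no_grad():
        pruned = (mask == 0)
        k = int(regen_ratio * pruned.sum().item())
        if k > 0:
            scores = layer.weight.grad.abs() * pruned.float()
            idx = torch.topk(scores.flatten(), k).indices
            mask.view(-1)[idx] = 1.0  # reopen these connections for training
    return mask


if __name__ == "__main__":
    gen_layer = nn.Linear(64, 64)                 # stand-in for one generator layer
    mask = torch.ones_like(gen_layer.weight)      # 1 = active weight, 0 = pruned
    opt = torch.optim.Adam(gen_layer.parameters(), lr=1e-3)

    for step in range(100):
        z = torch.randn(16, 64)
        loss = gen_layer(z).pow(2).mean()         # placeholder for the adversarial loss
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            gen_layer.weight.mul_(mask)           # keep pruned weights at zero between events

        if step % 10 == 0:                        # periodic prune/regenerate cycle (illustrative)
            mask = apply_dynamic_pruning(gen_layer, mask, prune_ratio=0.1)
            mask = regenerate_weights(gen_layer, mask, regen_ratio=0.05)

    print(f"final sparsity: {1 - mask.mean().item():.2%}")
```

In the paper's setting the cycle would be applied layer-wise to both G and D, with ratios tuned per architecture; the fixed 10-step cadence and ratios here are only to keep the sketch short.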
Files in This Item:
File | Description | Size | Format
---|---|---|---
RG_GAN.pdf | | 1.41 MB | Adobe PDF
Page views: 79 (as of Apr 14, 2025)
Downloads: 62 (as of Apr 14, 2025)
Scopus citations: 4 (as of May 29, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.