Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/95503
Title: JDGAN: Enhancing generator on extremely limited data via joint distribution
Authors: Li, W
Xu, LC 
Liang, ZX 
Wang, SZ
Cao, JN 
Lam, TC 
Cui, XH
Issue Date: 28-Mar-2021
Source: Neurocomputing, 28 Mar. 2021, v. 431, p. 148-162
Abstract: The Generative Adversarial Network (GAN) is a thriving generative model, and considerable effort has been devoted to enhancing its generation capabilities, either by designing different adversarial frameworks (i.e., the discriminator and the generator) or by redesigning the penalty function. Although existing models have proven effective, their generation capabilities remain limited. Existing GAN variants either produce identical generated instances or generate low-quality simulation data when the training data are diverse yet extremely limited (a dataset consisting of a set of classes, each holding only a few or even a single sample) or extremely imbalanced (one category holding a set of samples while the other categories hold only a single sample each). In this paper, we present an innovative approach to this issue that employs a joint distribution together with the reparameterization method to reparameterize the randomized (latent) space as a mixture model and to learn the parameters of this mixture model jointly with those of the GAN. Accordingly, we term our approach the Joint Distribution GAN (JDGAN). We show that JDGAN not only generates high-quality, diverse simulation data, but also increases the overlap between the generating distribution and the raw data distribution. We conduct extensive experiments on the MNIST, CIFAR10 and Mass Spectrometry datasets, each using extremely limited amounts of data, to demonstrate that JDGAN both achieves the smallest Fréchet Inception Distance (FID) score and produces diverse generated data.
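The following is a minimal PyTorch sketch of the idea summarized in the abstract: the latent prior is modeled as a learnable Gaussian mixture, sampled via the reparameterization trick so that the mixture parameters receive gradients from the GAN loss and can be optimized jointly with the generator. The class name MixturePrior, the component count, and the toy generator are illustrative assumptions, not the paper's implementation.

    import torch
    import torch.nn as nn

    class MixturePrior(nn.Module):
        # Learnable K-component Gaussian mixture over the latent space.
        # Sampling uses the reparameterization trick, so gradients from the
        # GAN loss flow back into the mixture means and scales.
        def __init__(self, n_components=10, latent_dim=100):
            super().__init__()
            self.mu = nn.Parameter(torch.randn(n_components, latent_dim))
            self.log_sigma = nn.Parameter(torch.zeros(n_components, latent_dim))

        def forward(self, batch_size):
            # Draw a mixture component uniformly at random for each sample.
            k = torch.randint(0, self.mu.size(0), (batch_size,))
            eps = torch.randn(batch_size, self.mu.size(1))
            # Reparameterize: z = mu_k + sigma_k * eps (differentiable in mu and sigma).
            return self.mu[k] + self.log_sigma[k].exp() * eps

    prior = MixturePrior()
    generator = nn.Sequential(nn.Linear(100, 256), nn.ReLU(), nn.Linear(256, 784))
    # A single optimizer updates the generator and the mixture parameters jointly.
    opt_g = torch.optim.Adam(list(generator.parameters()) + list(prior.parameters()), lr=2e-4)

    z = prior(64)        # reparameterized latent batch
    fake = generator(z)  # fed to the discriminator as in a standard GAN step

In this sketch, the usual generator/discriminator losses would drive opt_g exactly as in standard GAN training; only the latent sampling path changes.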
Keywords: Mode collapse; Joint distribution; Reparameterization; GAN
Publisher: Elsevier
Journal: Neurocomputing 
ISSN: 0925-2312
EISSN: 1872-8286
DOI: 10.1016/j.neucom.2020.12.001
Rights: © 2020 Elsevier B.V. All rights reserved.
© 2020. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/.
The following publication Li, W., Xu, L., Liang, Z., Wang, S., Cao, J., Lam, T. C., & Cui, X. (2021). JDGAN: Enhancing generator on extremely limited data via joint distribution. Neurocomputing, 431, 148-162 is available at https://doi.org/10.1016/j.neucom.2020.12.001
Appears in Collections: Journal/Magazine Article

Files in This Item:
Xu_Jdgan_Enhancing_Generator.pdf (Pre-Published version, 3.05 MB, Adobe PDF)
Open Access Information:
Status: open access
File Version: Final Accepted Manuscript

Page views: 94 (as of Sep 22, 2024)
Downloads: 59 (as of Sep 22, 2024)
Scopus citations: 9 (as of Sep 26, 2024)
Web of Science citations: 8 (as of Sep 26, 2024)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.