JDGAN: Enhancing generator on extremely limited data via joint distribution

Wei Li, Linchuan Xu, Zhixuan Liang, Senzhang Wang, Jiannong Cao, Thomas C. Lam, Xiaohui Cui

Research output: Journal article, peer-reviewed

9 Citations (Scopus)


Generative Adversarial Networks (GANs) are a thriving class of generative models, and considerable effort has been made to enhance their generation capabilities by redesigning the adversarial framework (i.e., the discriminator and the generator) or the penalty function. Although existing models have proven very effective, their generation capabilities remain limited. Existing GAN variants either produce identical generated instances or generate low-quality simulation data when the training data are diverse yet extremely limited (a dataset consisting of several classes, each holding only a few or even a single sample) or extremely imbalanced (one category holding many samples while the others hold a single sample each). In this paper, we present an approach that tackles this issue by combining a joint distribution with the reparameterization method: the randomized latent space is reparameterized as a mixture model whose parameters are learned jointly with those of the GAN. We therefore term our approach Joint Distribution GAN (JDGAN). We show that JDGAN not only generates diverse, high-quality simulation data, but also increases the overlap between the generated distribution and the raw data distribution. We conduct extensive experiments on the MNIST, CIFAR10 and Mass Spectrometry datasets, each using extremely limited amounts of data, to demonstrate that JDGAN both achieves the smallest Fréchet Inception Distance (FID) score and produces diverse generated data.
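The core idea described in the abstract can be illustrated with a minimal sketch (not the authors' code; all names and shapes here are illustrative assumptions): instead of drawing the generator's latent vector z from a single standard Gaussian, the latent space is modeled as a mixture of K Gaussians, and each component's parameters are exposed via the reparameterization trick z = mu_k + sigma_k * eps so they can, in principle, be trained jointly with the generator.

```python
import numpy as np

# Hedged sketch of mixture-model latent reparameterization (assumed setup,
# not the published JDGAN implementation). The noise eps is independent of
# the mixture parameters, so mu and log_sigma remain differentiable paths
# in a framework with autograd; numpy is used here only to show the sampling.

rng = np.random.default_rng(0)

K, dim = 4, 8                       # hypothetical: 4 components, 8-dim latent
mu = rng.normal(size=(K, dim))      # learnable component means (assumption)
log_sigma = np.zeros((K, dim))      # learnable log standard deviations
weights = np.full(K, 1.0 / K)       # mixing weights, uniform to start

def sample_latent(n):
    """Draw n latent vectors from the mixture via reparameterization."""
    ks = rng.choice(K, size=n, p=weights)   # pick one component per sample
    eps = rng.normal(size=(n, dim))         # noise independent of parameters
    return mu[ks] + np.exp(log_sigma[ks]) * eps

z = sample_latent(16)
print(z.shape)  # (16, 8) — a batch of latent codes fed to the generator
```

In an actual training loop these samples would be passed to the generator, and gradients of the GAN loss would flow back through `mu` and `log_sigma`, which is what lets the mixture adapt to cover a diverse but tiny dataset.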

Original language: English
Pages (from-to): 148-162
Number of pages: 15
Early online date: 16 Dec 2020
Publication status: Published - 28 Mar 2021


Keywords

  • GAN
  • Joint distribution
  • Mode collapse
  • Reparameterization

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence


