Improving Speech Emotion Recognition with Adversarial Data Augmentation Network

Research output: Journal article › Academic research › peer-reviewed

5 Citations (Scopus)


When training data are scarce, it is challenging to train a deep neural network without overfitting. To overcome this challenge, this article proposes a new data augmentation network, namely the adversarial data augmentation network (ADAN), based on generative adversarial networks (GANs). The ADAN consists of a GAN, an autoencoder, and an auxiliary classifier. These networks are trained adversarially to synthesize class-dependent feature vectors in both the latent space and the original feature space, which can be added to the real training data for training classifiers. Instead of the conventional cross-entropy loss, the Wasserstein divergence is used for adversarial training in an attempt to produce high-quality synthetic samples. The proposed networks were applied to speech emotion recognition using EmoDB and IEMOCAP as the evaluation datasets. It was found that by forcing the synthetic latent vectors and the real latent vectors to share a common representation, the gradient vanishing problem can be largely alleviated. Results also show that the augmented data generated by the proposed networks are rich in emotion information. Thus, the resulting emotion classifiers are competitive with state-of-the-art speech emotion recognition systems.
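The core augmentation idea in the abstract — train with a Wasserstein-style critic objective and append synthetic class-dependent feature vectors to the real training set — can be sketched as follows. This is an illustrative outline with hypothetical function names and toy data, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def critic_loss(d_real, d_fake):
    """Wasserstein-style critic objective: maximize E[D(real)] - E[D(fake)].
    Returned negated so it can be minimized like an ordinary loss."""
    return -(np.mean(d_real) - np.mean(d_fake))

def augment(real_feats, real_labels, synth_feats, synth_labels):
    """Concatenate synthetic class-dependent feature vectors onto the
    real training set, as in GAN-based data augmentation."""
    X = np.vstack([real_feats, synth_feats])
    y = np.concatenate([real_labels, synth_labels])
    return X, y

# Toy example: 10 real and 5 synthetic 8-dim emotion feature vectors,
# with labels drawn from 4 hypothetical emotion classes.
X_real = rng.normal(size=(10, 8)); y_real = rng.integers(0, 4, size=10)
X_syn  = rng.normal(size=(5, 8));  y_syn  = rng.integers(0, 4, size=5)
X_aug, y_aug = augment(X_real, y_real, X_syn, y_syn)
print(X_aug.shape, y_aug.shape)  # (15, 8) (15,)
```

A downstream emotion classifier would then be trained on `(X_aug, y_aug)` instead of the scarce real data alone; the paper's contribution lies in how the GAN, autoencoder, and auxiliary classifier jointly produce the synthetic vectors, which this sketch does not attempt to reproduce.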

Original language: English
Pages (from-to): 172-184
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 1
Publication status: Published - 1 Jan 2022


Keywords

  • Data augmentation
  • generative adversarial networks (GANs)
  • speech emotion recognition
  • Wasserstein divergence

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence


