Abstract
Text classification often faces the problem of imbalanced training data. This is true in sentiment analysis and particularly prominent in emotion classification, where multiple emotion categories are very likely to produce naturally skewed training data. Different sampling methods have been proposed to improve classification performance by reducing the imbalance ratio between training classes. However, data sparseness and the small disjunct problem remain obstacles in generating new samples for minority classes when the data are skewed and limited. Methods that produce meaningful samples for smaller classes, rather than simple duplicates, are essential to overcoming this problem. In this paper, we present an oversampling method based on word embedding compositionality which produces meaningful balanced training data. We first train a continuous skip-gram model on a large corpus to form a word embedding model that maintains the syntactic and semantic integrity of the word features. Then, a compositional algorithm based on recursive neural tensor networks is used to construct sentence vectors from the word embedding model. Finally, we use the SMOTE algorithm as an oversampling method to generate samples for the minority classes and produce a fully balanced training set. Evaluation results on two quite different tasks show that the feature composition method and the oversampling method are both important in obtaining improved classification results. Our method effectively addresses the data imbalance issue and consequently achieves improved results for both sentiment and emotion classification.
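The sketch below is a minimal, illustrative approximation of the pipeline described in the abstract, assuming gensim for the skip-gram embeddings and imbalanced-learn for SMOTE. The paper composes sentence vectors with a recursive neural tensor network (RNTN); here, averaging word vectors stands in for that composition step purely for illustration, and the toy corpus, labels, and parameter values are hypothetical rather than the paper's setup.

```python
# Sketch of the three-step pipeline: skip-gram embeddings -> sentence vectors
# -> SMOTE oversampling. Assumes gensim and imbalanced-learn are installed.
import numpy as np
from gensim.models import Word2Vec
from imblearn.over_sampling import SMOTE

# Hypothetical tokenized training sentences with imbalanced emotion labels.
sentences = [
    ["i", "feel", "wonderful", "today"],
    ["this", "is", "a", "great", "result"],
    ["so", "happy", "with", "the", "outcome"],
    ["what", "a", "pleasant", "surprise"],
    ["i", "am", "terribly", "sad"],
    ["such", "a", "sad", "ending"],
    ["this", "makes", "me", "angry"],
    ["an", "angry", "reply", "followed"],
]
labels = ["joy", "joy", "joy", "joy", "sadness", "sadness", "anger", "anger"]

# Step 1: train a continuous skip-gram model (sg=1) on the corpus.
w2v = Word2Vec(sentences, vector_size=50, window=5, min_count=1, sg=1, epochs=50)

# Step 2: build sentence vectors. The paper uses RNTN-based composition;
# a simple average of word embeddings is used here as a placeholder.
def sentence_vector(tokens, model):
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vecs, axis=0)

X = np.vstack([sentence_vector(s, w2v) for s in sentences])
y = np.array(labels)

# Step 3: oversample the minority classes with SMOTE to obtain a fully
# balanced training set. k_neighbors must be smaller than the smallest class.
smote = SMOTE(k_neighbors=1, random_state=42)
X_balanced, y_balanced = smote.fit_resample(X, y)

print(dict(zip(*np.unique(y_balanced, return_counts=True))))
```

In the paper's setting, the RNTN-composed sentence vectors would replace the averaged vectors, and the balanced feature matrix would then be passed to the sentiment or emotion classifier.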
| Original language | English |
| --- | --- |
| Pages (from-to) | 226-240 |
| Number of pages | 15 |
| Journal | Cognitive Computation |
| Volume | 7 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 1 Apr 2015 |
Keywords
- Emotion classification
- Imbalanced training
- Semantic compositionality
- Sentiment analysis
- Word embedding
ASJC Scopus subject areas
- Computer Vision and Pattern Recognition
- Computer Science Applications
- Cognitive Neuroscience