Abstract
Text classification is an important task in natural language processing. Previous text classification models do not perform well on short texts because of the data sparsity problem. To address this problem, recent research enriches text representations with concepts of words extracted from a knowledge base. However, this approach may introduce overly general concepts that are unhelpful for discriminating between categories, and it may also bring noise into the text representation and degrade performance. To tackle these problems, we propose a neural network called DE-CNN, which incorporates context-relevant concepts into a convolutional neural network for short text classification. Our model first uses two layers to extract concepts and context respectively, and then employs an attention layer to select context-relevant concepts, which are incorporated into the text representation for short text classification. Experimental results on three text classification tasks show that the proposed model outperforms the compared state-of-the-art models.
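The sketch below is a minimal illustration, not the authors' released code, of the mechanism the abstract describes: concept embeddings retrieved from a knowledge base are scored against a context vector by an attention layer, and the resulting context-relevant concept vector is concatenated with the word embeddings before a convolutional classifier. All names, dimensions, and the mean-pooled context encoder are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConceptAttentionCNN(nn.Module):
    """Hypothetical sketch of a concept-enriched CNN text classifier."""
    def __init__(self, vocab_size, concept_vocab_size, emb_dim=128, num_classes=5):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.concept_emb = nn.Embedding(concept_vocab_size, emb_dim)
        self.conv = nn.Conv1d(2 * emb_dim, 100, kernel_size=3, padding=1)
        self.fc = nn.Linear(100, num_classes)

    def forward(self, words, concepts):
        # words: (batch, seq_len) token ids of the short text
        # concepts: (batch, num_concepts) concept ids retrieved from a knowledge base
        w = self.word_emb(words)                # (batch, seq_len, emb_dim)
        c = self.concept_emb(concepts)          # (batch, num_concepts, emb_dim)
        context = w.mean(dim=1, keepdim=True)   # crude context vector, (batch, 1, emb_dim)
        # Attention: score each concept against the context to keep relevant ones.
        scores = torch.bmm(c, context.transpose(1, 2)).squeeze(-1)  # (batch, num_concepts)
        alpha = F.softmax(scores, dim=-1).unsqueeze(-1)             # (batch, num_concepts, 1)
        concept_vec = (alpha * c).sum(dim=1, keepdim=True)          # (batch, 1, emb_dim)
        # Enrich every word representation with the context-relevant concept vector.
        enriched = torch.cat([w, concept_vec.expand(-1, w.size(1), -1)], dim=-1)
        h = F.relu(self.conv(enriched.transpose(1, 2)))  # (batch, 100, seq_len)
        pooled = h.max(dim=-1).values                    # max-over-time pooling
        return self.fc(pooled)                            # class logits
```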
| Original language | English |
| --- | --- |
| Pages (from-to) | 42-53 |
| Number of pages | 12 |
| Journal | Neurocomputing |
| Volume | 386 |
| DOIs | |
| Publication status | Published - 21 Apr 2020 |
Keywords
- Attention mechanism
- Knowledge base
- Neural networks
- Short text classification
ASJC Scopus subject areas
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence