Incorporating context-relevant concepts into convolutional neural networks for short text classification

Jingyun Xu, Yi Cai, Xin Wu, Xue Lei, Qingbao Huang, Ho fung Leung, Qing Li

Research output: Journal article publication › Journal article › Academic research › peer-review

6 Citations (Scopus)

Abstract

Text classification is an important task in natural language processing. Previous text classification models do not perform well on short texts because of the data sparsity problem. To address this problem, recent research extracts the concepts of words from a knowledge base to enrich the text representation. However, this approach may retrieve overly general concepts, which are not helpful for discriminating between categories in text classification; such concepts can also introduce noise into the text representation and degrade performance. To tackle these problems, we propose a neural network called DE-CNN, which incorporates context-relevant concepts into a convolutional neural network for short text classification. Our model first uses two layers to extract concepts and context respectively, then employs an attention layer to select context-relevant concepts, and finally incorporates these concepts into the text representation for classification. Experimental results on three text classification tasks show that our proposed model outperforms the compared state-of-the-art models.
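The core mechanism the abstract describes can be sketched as follows: each candidate concept (retrieved from a knowledge base) is scored against the sentence context, the scores are softmax-normalized into attention weights, and the weighted concept vector is appended to the text representation before convolution. This is a minimal illustrative sketch, not the authors' DE-CNN code; all vectors, dimensions, and the `attend` helper are assumptions for demonstration.

```python
import math

def dot(u, v):
    # Inner product of two equal-length vectors.
    return sum(a * b for a, b in zip(u, v))

def attend(context, concepts):
    """Softmax attention over concept embeddings, conditioned on the context.

    Returns the attention weights and the attended (weighted-sum) concept
    vector that would be appended to the text representation.
    """
    scores = [dot(c, context) for c in concepts]
    m = max(scores)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    attended = [sum(w * c[i] for w, c in zip(weights, concepts))
                for i in range(len(context))]
    return weights, attended

# Toy context vector (e.g. an average of the short text's word embeddings)
# and three toy concept embeddings; values are arbitrary illustrations.
context = [0.5, -0.2, 0.1]
concepts = [
    [1.0, 0.0, 0.0],    # concept well aligned with the context
    [-1.0, 0.5, 0.0],   # a less relevant, more "general" concept
    [0.0, 0.0, 1.0],
]

weights, attended = attend(context, concepts)
print([round(w, 3) for w in weights])  # the aligned concept gets the largest weight
```

In the full model, `attended` would be concatenated with the word embeddings and fed to the convolutional layers, so that irrelevant or overly general concepts contribute little to the final representation.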

Original language: English
Pages (from-to): 42-53
Number of pages: 12
Journal: Neurocomputing
Volume: 386
DOIs
Publication status: Published - 21 Apr 2020

Keywords

  • Attention mechanism
  • Knowledge base
  • Neural networks
  • Short text classification

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
