Distant Supervision for Neural Relation Extraction Integrated with Word Attention and Property Features

Jianfeng Qu, Dantong Ouyang, Wen Hua, Yuxin Ye, Ximing Li

Research output: Journal article › Academic research › peer-review

39 Citations (Scopus)

Abstract

Distant supervision for neural relation extraction is an efficient approach to extracting large numbers of relations from plain text. However, existing neural methods fail to capture the critical words during sentence encoding and, for some positive training instances, lack useful sentence-level information. To address these issues, we propose a novel neural relation extraction model. First, we develop a word-level attention mechanism that distinguishes the importance of each individual word in a sentence, increasing the attention weights of critical words. Second, we exploit the semantic information in the word embeddings of the target entities as a supplementary feature for the extractor. Experimental results show that our model outperforms previous state-of-the-art baselines.
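To illustrate the two ideas in the abstract, the following is a minimal numpy sketch of a word-level attention encoder combined with entity "property" features. It is not the authors' implementation: the scoring function (dot product against the summed entity embeddings), the softmax pooling, and the simple concatenation of entity embeddings are assumptions made here for clarity.

```python
import numpy as np

def word_attention_encode(word_embs, query):
    """Weight each word by its relevance to a query vector (here, the summed
    entity embeddings), then pool into a single sentence vector.
    Simplified sketch; the paper's exact scoring function may differ."""
    scores = word_embs @ query                  # one score per word, shape (seq_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # softmax attention weights
    return weights @ word_embs                  # weighted sum = sentence vector

# Toy example: a 5-word sentence with 8-dimensional embeddings and two target entities.
rng = np.random.default_rng(0)
sentence = rng.normal(size=(5, 8))
head_emb, tail_emb = rng.normal(size=8), rng.normal(size=8)

sent_vec = word_attention_encode(sentence, head_emb + tail_emb)

# Property features (assumed composition): the entity embeddings are appended to the
# attention-pooled sentence encoding before the relation classifier.
features = np.concatenate([sent_vec, head_emb, tail_emb])
print(features.shape)  # (24,)
```

In this sketch the attention weights up-weight words that align with the entity pair, while the concatenated entity embeddings supply a signal even when the sentence itself is uninformative, mirroring the two contributions described above.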

Original language: English
Pages (from-to): 59-69
Number of pages: 11
Journal: Neural Networks
Volume: 100
DOIs
Publication status: Published - Apr 2018
Externally published: Yes

Keywords

  • Distant supervision
  • Neural relation extraction
  • Sentence encoding
  • Supplementary feature
  • Word-level attention

ASJC Scopus subject areas

  • Artificial Intelligence
  • Information Systems
