TY - GEN
T1 - A Distant Supervised Relation Extraction Model with Two Denoising Strategies
AU - Zhou, Zikai
AU - Cai, Yi
AU - Xu, Jingyun
AU - Xie, Jiayuan
AU - Li, Qing
AU - Xie, Haoran
PY - 2019/7
Y1 - 2019/7
N2 - Distantly supervised relation extraction is an effective way to find relational facts in text. However, distant supervision inevitably produces wrongly labeled sentences, and such noisy sentences degrade the performance of relation extraction models. Although the existing piecewise convolutional neural network model with sentence-level attention (PCNN+ATT) effectively reduces the effect of noisy sentences, it still has two limitations. On one hand, it adopts a PCNN module as the sentence encoder, which captures only local contextual features of words and may lose important information. On the other hand, it neglects the fact that not all words contribute equally to the semantics of a sentence. To address these two issues, we propose a hierarchical attention-based bidirectional GRU (HA-BiGRU) model. For the first limitation, our model uses a BiGRU module in place of the PCNN so as to extract global contextual information. For the second limitation, it combines word-level and sentence-level attention mechanisms, which help obtain accurate sentence representations. To further alleviate the wrong-labeling problem, we first calculate the co-occurrence probabilities (CP) between shortest dependency paths (SDP) and relation labels. Based on these co-occurrence probabilities, we propose two denoising strategies that reduce noise interference by filtering the labeled data and by integrating the CP information into the model, respectively. Experimental results on the Freebase and New York Times corpus (Freebase+NYT) show that the HA-BiGRU model outperforms baseline models, and that the two CP-based denoising strategies improve the robustness of the HA-BiGRU model.
AB - Distantly supervised relation extraction is an effective way to find relational facts in text. However, distant supervision inevitably produces wrongly labeled sentences, and such noisy sentences degrade the performance of relation extraction models. Although the existing piecewise convolutional neural network model with sentence-level attention (PCNN+ATT) effectively reduces the effect of noisy sentences, it still has two limitations. On one hand, it adopts a PCNN module as the sentence encoder, which captures only local contextual features of words and may lose important information. On the other hand, it neglects the fact that not all words contribute equally to the semantics of a sentence. To address these two issues, we propose a hierarchical attention-based bidirectional GRU (HA-BiGRU) model. For the first limitation, our model uses a BiGRU module in place of the PCNN so as to extract global contextual information. For the second limitation, it combines word-level and sentence-level attention mechanisms, which help obtain accurate sentence representations. To further alleviate the wrong-labeling problem, we first calculate the co-occurrence probabilities (CP) between shortest dependency paths (SDP) and relation labels. Based on these co-occurrence probabilities, we propose two denoising strategies that reduce noise interference by filtering the labeled data and by integrating the CP information into the model, respectively. Experimental results on the Freebase and New York Times corpus (Freebase+NYT) show that the HA-BiGRU model outperforms baseline models, and that the two CP-based denoising strategies improve the robustness of the HA-BiGRU model.
UR - http://www.scopus.com/inward/record.url?scp=85073193270&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2019.8852378
DO - 10.1109/IJCNN.2019.8852378
M3 - Conference article published in proceedings or book
AN - SCOPUS:85073193270
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2019 International Joint Conference on Neural Networks, IJCNN 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 International Joint Conference on Neural Networks, IJCNN 2019
Y2 - 14 July 2019 through 19 July 2019
ER -