TY - GEN
T1 - A part-of-speech enhanced neural conversation model
AU - Luo, Chuwei
AU - Li, Wenjie
AU - Chen, Qiang
AU - He, Yanxiang
PY - 2017/1/1
Y1 - 2017/1/1
N2 - Modeling the syntactic information of sentences is essential for neural response generation models to produce appropriate responses of high linguistic quality. However, no previous work on conversational response generation using sequence-to-sequence (Seq2Seq) neural network models has been reported to take sentence syntactic information into account. In this paper, we present two part-of-speech (POS) enhanced models that incorporate POS information into the Seq2Seq neural conversation model. When training these models, the corresponding POS tag is attached to each word in the post and the response so that the word sequences and the POS tag sequences can be interrelated. When a word in a response is to be generated, it is constrained by the expected POS tag. The experimental results show that the POS-enhanced Seq2Seq models generate more grammatically correct and appropriate responses, in terms of both perplexity and BLEU, than the word-level Seq2Seq model.
AB - Modeling the syntactic information of sentences is essential for neural response generation models to produce appropriate responses of high linguistic quality. However, no previous work on conversational response generation using sequence-to-sequence (Seq2Seq) neural network models has been reported to take sentence syntactic information into account. In this paper, we present two part-of-speech (POS) enhanced models that incorporate POS information into the Seq2Seq neural conversation model. When training these models, the corresponding POS tag is attached to each word in the post and the response so that the word sequences and the POS tag sequences can be interrelated. When a word in a response is to be generated, it is constrained by the expected POS tag. The experimental results show that the POS-enhanced Seq2Seq models generate more grammatically correct and appropriate responses, in terms of both perplexity and BLEU, than the word-level Seq2Seq model.
KW - Response generation
KW - Seq2Seq neural conversation model
KW - Incorporating syntactic information
UR - http://www.scopus.com/inward/record.url?scp=85018703270&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-56608-5_14
DO - 10.1007/978-3-319-56608-5_14
M3 - Conference article published in proceeding or book
SN - 9783319566078
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 173
EP - 185
BT - Advances in Information Retrieval - 39th European Conference on IR Research, ECIR 2017, Proceedings
PB - Springer Verlag
T2 - 39th European Conference on Information Retrieval, ECIR 2017
Y2 - 8 April 2017 through 13 April 2017
ER -