TY - JOUR
T1 - Topic-level knowledge sub-graphs for multi-turn dialogue generation
AU - Li, Jing
AU - Huang, Qingbao
AU - Cai, Yi
AU - Liu, Yongkang
AU - Fu, Mingyi
AU - Li, Qing
N1 - Funding Information:
This work was supported by the National Natural Science Foundation of China (62076100), the collaborative research grants from the Fundamental Research Funds for the Central Universities, China, SCUT (D2210010, D2200150, and D2201300), the Science and Technology Planning Project of Guangdong Province, China (2020B0101100002), the Science and Technology Programs of Guangzhou, China (201704030076, 201707010223, 201802010027, 201902010046), the Science and Technology Key Projects of Guangxi Province, China (2020AA21077007), the Hong Kong Research Grants Council, China (PolyU1121417, C1031-18G), and an internal research grant from the Hong Kong Polytechnic University, China (1.9B0V).
Publisher Copyright:
© 2021 Elsevier B.V.
PY - 2021/12/25
Y1 - 2021/12/25
N2 - Previous multi-turn dialogue approaches based on global Knowledge Graphs (KGs) still suffer from generic, uncontrollable, and incoherent response generation. Most of them neither consider the local topic-level semantic information of KGs nor effectively merge the information of long dialogue contexts and KGs into the dialogue generation. To tackle these issues, we propose a Topic-level Knowledge-aware Dialogue Generation model to capture context-aware topic-level knowledge information. Our method thus accounts for the topic-coherence, fluency, and diversity of generated responses. Specifically, we first decompose the given KG into a set of topic-level sub-graphs, with each sub-graph capturing a semantic component of the input KG. Furthermore, we design a Topic-level Sub-graphs Attention Network to compute a comprehensive representation of both the sub-graphs and previous turns of dialogue utterances, which is then decoded together with the current turn into a response. By using sub-graphs, our model is able to attend to different topical components of the KG and enhance topic-coherence. We perform extensive experiments on two datasets, DuRecDial and KdConv, to demonstrate the effectiveness of our model. The experimental results show that our model outperforms existing strong baselines.
AB - Previous multi-turn dialogue approaches based on global Knowledge Graphs (KGs) still suffer from generic, uncontrollable, and incoherent response generation. Most of them neither consider the local topic-level semantic information of KGs nor effectively merge the information of long dialogue contexts and KGs into the dialogue generation. To tackle these issues, we propose a Topic-level Knowledge-aware Dialogue Generation model to capture context-aware topic-level knowledge information. Our method thus accounts for the topic-coherence, fluency, and diversity of generated responses. Specifically, we first decompose the given KG into a set of topic-level sub-graphs, with each sub-graph capturing a semantic component of the input KG. Furthermore, we design a Topic-level Sub-graphs Attention Network to compute a comprehensive representation of both the sub-graphs and previous turns of dialogue utterances, which is then decoded together with the current turn into a response. By using sub-graphs, our model is able to attend to different topical components of the KG and enhance topic-coherence. We perform extensive experiments on two datasets, DuRecDial and KdConv, to demonstrate the effectiveness of our model. The experimental results show that our model outperforms existing strong baselines.
KW - Knowledge graph
KW - Knowledge-based dialogue system
KW - Multi-turn dialogue generation
KW - Topic-level
UR - http://www.scopus.com/inward/record.url?scp=85118138033&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2021.107499
DO - 10.1016/j.knosys.2021.107499
M3 - Journal article
AN - SCOPUS:85118138033
SN - 0950-7051
VL - 234
SP - 1
EP - 9
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 107499
ER -