Graph-Structured Context Understanding for Knowledge-grounded Response Generation

Yanran Li, Wenjie Li, Zhitao Wang

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

1 Citation (Scopus)

Abstract

In this work, we establish a context graph from both conversation utterances and external knowledge, and develop a novel graph-based encoder to better understand the conversation context. Specifically, the encoder fuses the information in the context graph stage-by-stage and provides global context-graph-aware representations of each node in the graph to facilitate knowledge-grounded response generation. On a large-scale conversation corpus, we validate the effectiveness of the proposed approach and demonstrate the benefit of knowledge in conversation understanding.
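The abstract describes fusing information over a context graph "stage-by-stage" to obtain context-graph-aware node representations. The paper's actual encoder architecture is not given here; as a rough illustration only, the sketch below uses a generic, hypothetical message-passing update (normalised-adjacency averaging with a random shared weight matrix, tanh nonlinearity) over a toy graph mixing utterance nodes and knowledge nodes. All names, dimensions, and edges are invented for the example.

```python
import numpy as np

def message_passing(node_feats, adj, num_stages=2, seed=0):
    """Illustrative graph encoder: each stage mixes every node's features
    with its neighbours', so representations become increasingly aware of
    the whole context graph. Hypothetical stand-in, not the paper's model."""
    rng = np.random.default_rng(seed)
    d = node_feats.shape[1]
    W = rng.normal(scale=0.1, size=(d, d))  # shared stage weights (random init)
    # Add self-loops and row-normalise so each node averages over neighbours.
    a = adj + np.eye(adj.shape[0])
    a = a / a.sum(axis=1, keepdims=True)
    h = node_feats
    for _ in range(num_stages):
        h = np.tanh(a @ h @ W)  # fuse neighbour information stage-by-stage
    return h

# Toy context graph: 3 utterance nodes + 2 external-knowledge nodes, 4-dim features.
feats = np.eye(5, 4)
adj = np.zeros((5, 5))
edges = [(0, 1), (1, 2), (0, 3), (2, 4)]  # utterance order + knowledge links
for i, j in edges:
    adj[i, j] = adj[j, i] = 1.0

out = message_passing(feats, adj)
print(out.shape)  # one context-aware vector per graph node
```

In a knowledge-grounded generator, representations like `out` would then condition the response decoder, letting it attend jointly to conversation utterances and the linked external knowledge.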

Original language: English
Title of host publication: SIGIR 2021 - Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval
Publisher: Association for Computing Machinery, Inc
Pages: 1930-1934
Number of pages: 5
ISBN (Electronic): 9781450380379
DOIs
Publication status: Published - 11 Jul 2021
Event: 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2021 - Virtual, Online, Canada
Duration: 11 Jul 2021 - 15 Jul 2021

Publication series

Name: SIGIR 2021 - Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval

Conference

Conference: 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2021
Country/Territory: Canada
City: Virtual, Online
Period: 11/07/21 - 15/07/21

Keywords

  • dialogue systems
  • knowledge-grounded response generation

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
  • Information Systems
