A structured distributional model of sentence meaning and processing

E. Chersoni, E. Santus, L. Pannitto, A. Lenci, P. Blache, C. R. Huang

Research output: Journal article (peer-reviewed)

13 Citations (Scopus)

Abstract

Most compositional distributional semantic models represent sentence meaning with a single vector. In this paper, we propose a structured distributional model (SDM) that combines word embeddings with formal semantics and is based on the assumption that sentences represent events and situations. The semantic representation of a sentence is a formal structure derived from discourse representation theory and containing distributional vectors. This structure is dynamically and incrementally built by integrating knowledge about events and their typical participants, as they are activated by lexical items. Event knowledge is modelled as a graph extracted from parsed corpora and encoding roles and relationships between participants that are represented as distributional vectors. SDM is grounded on extensive psycholinguistic research showing that generalized knowledge about events stored in semantic memory plays a key role in sentence comprehension. We evaluate SDM on two recently introduced compositionality data sets, and our results show that combining a simple compositional model with event knowledge consistently improves performance, even with different types of word embeddings.
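The following sketch is purely illustrative and is not the authors' released implementation: it assumes toy word embeddings and a hand-built event-knowledge graph (hypothetical names throughout) to show the general idea described in the abstract, namely combining a simple additive compositional vector with expectations activated by event knowledge.

```python
# Minimal sketch, assuming toy embeddings and a hand-built event graph.
# NOT the authors' code; illustrates the idea of augmenting a simple
# compositional vector with event-based expectations.

import numpy as np

# Toy word embeddings (the paper would use pre-trained distributional vectors).
EMBEDDINGS = {
    "chef":  np.array([0.9, 0.1, 0.2]),
    "cooks": np.array([0.2, 0.8, 0.1]),
    "meal":  np.array([0.3, 0.2, 0.9]),
    "soup":  np.array([0.4, 0.1, 0.8]),
}

# Hypothetical event-knowledge graph: for a (predicate, role) pair, the
# typical participants observed in parsed corpora.
EVENT_GRAPH = {
    ("cooks", "subject"): ["chef"],
    ("cooks", "object"):  ["meal", "soup"],
}

def expectation(predicate, role):
    """Centroid of the vectors of typical fillers for a role of a predicate."""
    fillers = EVENT_GRAPH.get((predicate, role), [])
    if not fillers:
        return None
    return np.mean([EMBEDDINGS[w] for w in fillers], axis=0)

def sentence_vector(subject, predicate, obj):
    """Additive composition of lexical vectors plus event-based expectations."""
    parts = [EMBEDDINGS[subject], EMBEDDINGS[predicate], EMBEDDINGS[obj]]
    for role in ("subject", "object"):
        exp = expectation(predicate, role)
        if exp is not None:
            parts.append(exp)
    return np.sum(parts, axis=0)

if __name__ == "__main__":
    print(sentence_vector("chef", "cooks", "soup"))
```

In the actual model, the formal structure is a discourse representation built incrementally as lexical items activate event knowledge; the centroid-of-fillers step above is only a stand-in for how activated participant vectors might contribute to the sentence representation.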

Original language: English
Pages (from-to): 483-502
Number of pages: 20
Journal: Natural Language Engineering
Volume: 25
Issue number: 4
DOIs
Publication status: Published - 31 Jul 2019

Keywords

  • discourse representation theory
  • distributional semantics
  • event knowledge
  • sentence processing
  • word embeddings

ASJC Scopus subject areas

  • Software
  • Language and Linguistics
  • Linguistics and Language
  • Artificial Intelligence
