Not all arguments are processed equally: a distributional model of argument complexity

Emmanuele Chersoni, Enrico Santus, Alessandro Lenci, Philippe Blache, Chu-ren Huang

Research output: Journal article, Academic research, peer-reviewed

3 Citations (Scopus)


This work addresses some questions about language processing: what does it mean for a natural language sentence to be semantically complex? What semantic features determine different degrees of difficulty for human comprehenders? Our goal is to introduce a framework for argument semantic complexity, in which processing difficulty depends on the typicality of the arguments in the sentence, that is, on their degree of compatibility with the selectional constraints of the predicate. We postulate that complexity depends on the difficulty of building a semantic representation of the event or situation conveyed by a sentence: this representation can either be retrieved directly from semantic memory or built dynamically by solving the constraints included in the stored representations. To test this hypothesis, we built a Distributional Semantic Model that computes a compositional cost function for the sentence unification process. Our evaluation on psycholinguistic datasets shows that the model accounts for semantic phenomena such as the context-sensitive update of argument expectations and the processing of logical metonymies.
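The core distributional idea described in the abstract, i.e. scoring how well an argument fits a predicate's selectional constraints, is commonly operationalised as cosine similarity between a candidate argument's vector and a prototype built from the predicate's typical fillers. The sketch below is purely illustrative and is not the paper's actual cost function: the toy 3-dimensional vectors, the example predicate "eat", and the filler words are all assumptions.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def prototype(vectors):
    """Centroid of the vectors of a predicate's typical fillers."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

# Toy embeddings standing in for real distributional vectors.
typical_objects_of_eat = [
    [0.9, 0.1, 0.0],  # e.g. "apple"
    [0.8, 0.2, 0.1],  # e.g. "bread"
]
proto = prototype(typical_objects_of_eat)

candidate_food = [0.8, 0.2, 0.0]   # a typical argument
candidate_rock = [0.05, 0.1, 0.9]  # an atypical argument

# Higher typicality would correspond to lower predicted processing cost.
assert cosine(candidate_food, proto) > cosine(candidate_rock, proto)
```

Under this kind of scheme, an atypical argument scores low against the prototype, which a cost function can translate into a higher predicted processing difficulty.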
Original language: English
Pages (from-to): 873-900
Number of pages: 28
Journal: Language Resources and Evaluation
Issue number: 4
Early online date: 3 Mar 2021
Publication status: Published - Dec 2021


Keywords

  • Argument complexity
  • Cognitive modeling
  • Distributional semantics
  • Logical metonymy
  • Psycholinguistics

ASJC Scopus subject areas

  • Language and Linguistics
  • Education
  • Linguistics and Language
  • Library and Information Sciences


