Designing a Uniform Meaning Representation for Natural Language Processing

Jens E. L. Van Gysel, Meagan Vigus, Jayeol Chun, Kenneth Lai, Sarah Moeller, Jiarui Yao, Tim O’Gorman, Andrew Cowell, William Croft, Chu-Ren Huang, Jan Hajič, James H. Martin, Stephan Oepen, Martha Palmer, James Pustejovsky, Rosa Vallejos, Nianwen Xue

Research output: Journal article (academic research, peer-reviewed)


In this paper, we present Uniform Meaning Representation (UMR), a meaning representation designed to annotate the semantic content of a text. UMR is primarily based on Abstract Meaning Representation (AMR), an annotation framework initially designed for English, but it also draws on other meaning representations. UMR extends AMR to other languages, particularly morphologically complex, low-resource languages, and adds features that are critical to semantic interpretation. It further enhances AMR with a companion document-level representation that captures linguistic phenomena such as coreference, as well as temporal and modal dependencies, which may extend beyond sentence boundaries.
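To make the two-level design concrete, the following is a minimal, purely illustrative sketch (not the official UMR format, and all variable names are hypothetical): sentence-level AMR-style graphs encoded as concept/relation triples, alongside a hypothetical document-level layer recording a coreference link and a temporal dependency that cross the sentence boundary.

```python
# Illustrative sketch of the two annotation levels described above.
# Sentence pair: "The girl left. She was tired."
# Each sentence-level graph is a list of (node, relation, target) triples,
# loosely mirroring AMR's concept/role structure.
sentence_graphs = {
    "s1": [
        ("s1l", ":instance", "leave-01"),  # event concept
        ("s1l", ":ARG0", "s1g"),           # the one who leaves
        ("s1g", ":instance", "girl"),
    ],
    "s2": [
        ("s2t", ":instance", "tired"),
        ("s2t", ":ARG1", "s2s"),
        ("s2s", ":instance", "she"),
    ],
}

# Hypothetical document-level layer: "she" corefers with "girl", and the
# state of being tired is annotated as holding before the leaving event.
document_level = {
    "coreference": [("s2s", "same-entity", "s1g")],
    "temporal": [("s1l", ":after", "s2t")],
}

def corefers(doc, a, b):
    """True if two sentence-level variables are linked as the same entity."""
    return any({x, y} == {a, b}
               for x, rel, y in doc["coreference"] if rel == "same-entity")

print(corefers(document_level, "s1g", "s2s"))  # True
```

The point of the sketch is only that coreference and temporal relations live in a separate structure that refers back to sentence-level variables, which is what lets the document-level representation span sentence boundaries.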
Original language: English
Pages (from-to): 343-360
Journal: KI - Künstliche Intelligenz
Publication status: Published - 30 Apr 2021


Keywords
  • Natural language processing
  • Meaning representation

