Abstract
While neural embeddings are a popular choice for word representation in a wide variety of NLP tasks, their use for thematic fit modeling has been limited, as they have been reported to lag behind syntax-based count models. In this paper, we propose a comprehensive evaluation of count models and word embeddings on thematic fit estimation, taking into account a larger number of parameters and verb roles and also introducing dependency-based embeddings into the comparison. Our results show a complex scenario, in which a key factor for performance seems to be whether the model has access to reliable syntactic information for building the distributional representations of the roles.
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the Twelfth Language Resources and Evaluation Conference |
| Editors | Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis |
| Publisher | European Language Resources Association (ELRA) |
| Pages | 5708-5713 |
| Publication status | Published - May 2020 |
| Event | 12th International Conference on Language Resources and Evaluation, LREC 2020 - Marseille, France. Duration: 11 May 2020 → 16 May 2020 |
Conference
| Conference | 12th International Conference on Language Resources and Evaluation, LREC 2020 |
|---|---|
| Country/Territory | France |
| City | Marseille |
| Period | 11/05/20 → 16/05/20 |
Title

Are Word Embeddings Really a Bad Fit for the Estimation of Thematic Fit?