Abstract
Medical term normalization maps a piece of text to one of a large number of output classes. Given the small size of the annotated datasets and the extremely long-tail distribution of the concepts, it is of utmost importance to develop models that can generalize to scarce or unseen concepts. An important attribute of most target ontologies is their hierarchical structure. In this paper we introduce a simple and effective learning strategy that leverages this structure to enhance the generalizability of both discriminative and generative models. The evaluation shows that the proposed strategy yields state-of-the-art performance on seen concepts and consistent improvements on unseen ones, while also enabling efficient zero-shot knowledge transfer across text typologies and datasets.
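The abstract does not spell out how the hierarchy is used, so the snippet below is only a minimal illustrative sketch, not the authors' method: it shows one generic way to expose an ontology's hierarchical structure to a normalization model, by serializing each concept as its root-to-leaf ancestor path so that related concepts share target prefixes. The toy parent map, concept names, and helpers (`ancestor_path`, `as_generative_target`) are all hypothetical.

```python
# Minimal sketch (NOT the paper's implementation): represent each target
# concept by its ancestor path in a toy hierarchical ontology, so that a
# generative normalizer can share label prefixes between frequent and
# rare concepts. All identifiers and the ontology below are hypothetical.

from typing import Dict, List

# Toy parent map: child concept -> parent concept ("ROOT" terminates the path).
PARENTS: Dict[str, str] = {
    "myocardial_infarction": "ischemic_heart_disease",
    "ischemic_heart_disease": "heart_disease",
    "heart_disease": "cardiovascular_disorder",
    "cardiovascular_disorder": "ROOT",
}

def ancestor_path(concept: str, parents: Dict[str, str]) -> List[str]:
    """Return the root-to-concept path, usable as a structured target."""
    path = [concept]
    while parents.get(path[-1], "ROOT") != "ROOT":
        path.append(parents[path[-1]])
    return list(reversed(path))

def as_generative_target(concept: str) -> str:
    """Serialize the path so a seq2seq model predicts coarse-to-fine labels."""
    return " > ".join(ancestor_path(concept, PARENTS))

if __name__ == "__main__":
    # cardiovascular_disorder > heart_disease > ischemic_heart_disease > myocardial_infarction
    print(as_generative_target("myocardial_infarction"))
```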
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing |
| Editors | Yoav Goldberg, Zornitsa Kozareva, Yue Zhang |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 8580–8591 |
| DOIs | |
| Publication status | Published - Dec 2022 |
| Event | Conference on Empirical Methods in Natural Language Processing, Abu Dhabi National Exhibition Centre, Abu Dhabi, United Arab Emirates. Duration: 7 Dec 2022 → 11 Dec 2022. https://2022.emnlp.org/ |
Conference
| Conference | Conference on Empirical Methods in Natural Language Processing |
| --- | --- |
| Abbreviated title | EMNLP |
| Country/Territory | United Arab Emirates |
| City | Abu Dhabi |
| Period | 7/12/22 → 11/12/22 |
| Internet address | https://2022.emnlp.org/ |