Abstract
With the recent rise in popularity of Transformer models in Natural Language Processing, research efforts have been dedicated to developing domain-adapted versions of BERT-like architectures. In this study, we focus on FinBERT, a Transformer model trained on text from the financial domain. By comparing its performance with that of the original BERT on a wide variety of financial text processing tasks, we find continual pretraining from the original model to be the more beneficial option; domain-specific pretraining from scratch, conversely, appears to be less effective.
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the Third Workshop on Economics and Natural Language Processing |
| Editors | Udo Hahn, Veronique Hoste, Amanda Stent |
| Publisher | Association for Computational Linguistics, ACL Anthology |
| Pages | 37–44 |
| Number of pages | 8 |
| ISBN (Electronic) | 978-1-954085-84-8 |
| Publication status | Published - Nov 2021 |
| Event | Workshop on Economics and Natural Language Processing - Barcelo Bavaro Convention Center, Punta Cana, Dominican Republic. Duration: 11 Nov 2021 → 11 Nov 2021 |
Workshop
| Workshop | Workshop on Economics and Natural Language Processing |
|---|---|
| Country/Territory | Dominican Republic |
| City | Punta Cana |
| Period | 11/11/21 → 11/11/21 |