Is Domain Adaptation Worth Your Investment? Comparing BERT and FinBERT on Financial Tasks

Research output: Conference article published in proceedings or book (academic research, peer-reviewed)

Abstract

With the recent rise in popularity of Transformer models in Natural Language Processing, research efforts have been dedicated to developing domain-adapted versions of BERT-like architectures. In this study, we focus on FinBERT, a Transformer model trained on text from the financial domain. By comparing its performance with that of the original BERT on a wide variety of financial text-processing tasks, we find continual pretraining from the original model to be the more beneficial option. Domain-specific pretraining from scratch, conversely, appears to be less effective.
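Both strategies the abstract compares (continual pretraining from BERT and pretraining from scratch) rely on BERT's masked-language-modeling objective. As an illustration not taken from the paper, here is a minimal Python sketch of BERT-style token masking: 15% of tokens are selected for prediction, and each selected token is replaced by `[MASK]` 80% of the time, by a random vocabulary token 10% of the time, or kept unchanged 10% of the time. The function name, toy vocabulary, and seed are all illustrative assumptions:

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """BERT-style MLM corruption (illustrative sketch, not the paper's code).

    Each token is independently selected with probability `mask_prob`.
    A selected token becomes `mask_token` 80% of the time, a random
    vocabulary token 10%, or stays unchanged 10%. Returns the corrupted
    sequence and a parallel list of prediction targets (None where no
    loss is computed).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    corrupted, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            targets.append(tok)  # the model must recover the original token
            roll = rng.random()
            if roll < 0.8:
                corrupted.append(mask_token)
            elif roll < 0.9:
                corrupted.append(rng.choice(vocab))
            else:
                corrupted.append(tok)
        else:
            targets.append(None)  # position excluded from the MLM loss
            corrupted.append(tok)
    return corrupted, targets

# Toy financial-domain example (hypothetical sentence and vocabulary)
tokens = "the firm reported quarterly earnings above consensus estimates".split()
corrupted, targets = mask_tokens(tokens, vocab=tokens)
```

Under continual pretraining, batches corrupted this way (drawn from financial text) are fed to a model initialized from BERT's weights; under from-scratch pretraining, the same objective starts from random weights, which the paper finds less effective.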
Original language: English
Title of host publication: Proceedings of the Third Workshop on Economics and Natural Language Processing
Editors: Udo Hahn, Veronique Hoste, Amanda Stent
Publisher: Association for Computational Linguistics, ACL Anthology
Pages: 37–44
Number of pages: 8
ISBN (Electronic): 978-1-954085-84-8
DOIs
Publication status: Published - Nov 2021
Event: Workshop on Economics and Natural Language Processing - Barcelo Bavaro Convention Center, Punta Cana, Dominican Republic
Duration: 11 Nov 2021 – 11 Nov 2021

Workshop

Workshop: Workshop on Economics and Natural Language Processing
Country/Territory: Dominican Republic
City: Punta Cana
Period: 11/11/21 – 11/11/21

