Probing Numerical Concepts in Financial Text with BERT Models

Shanyue Guo, Le Qiu, Emmanuele Chersoni

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

Numbers are notoriously an essential component of financial texts, and their correct understanding is key for automatic systems to extract and process information efficiently.
In our paper, we analyze the embeddings of different BERT-based models by testing them on supervised and unsupervised probing tasks for financial numeral understanding and value ordering.
Our results show that LMs with different types of training have complementary strengths, suggesting that their embeddings should be combined for more stable performance across tasks and categories.
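A supervised probing task of the kind the abstract describes can be illustrated with a minimal sketch: a linear classifier is trained on frozen token embeddings to test whether a numeral category is linearly recoverable from the representation. The data here is synthetic (random vectors with an injected class signal standing in for BERT hidden states), and the category labels are hypothetical; in the paper's setup the inputs would be embeddings of numerals in financial text.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in for numeral-token embeddings: random 64-d vectors
# with a weak linear class signal injected so the probe has something to
# find. In a real probing setup these would be hidden states from a
# BERT-based model, and labels would be numeral categories.
n, d = 400, 64
labels = rng.integers(0, 2, size=n)   # e.g. two numeral categories
X = rng.normal(size=(n, d))
X[:, 0] += labels * 3.0               # signal lives in one dimension

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

# A linear probe: if a simple classifier recovers the category from
# frozen embeddings, the information is linearly encoded in them.
probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
acc = probe.score(X_te, y_te)
print(f"probe accuracy: {acc:.2f}")
```

Probe accuracy well above chance on held-out data is the usual evidence that the tested property is encoded; comparing accuracies across models is what allows the kind of complementarity claim made above.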
Original language: English
Title of host publication: Proceedings of the Joint Workshop of the IJCAI Financial Technology and Natural Language Processing (FinNLP) and the 1st Agent AI for Scenario Planning (AgentScen)
Editors: Chung-Chi Chen, Tatsuya Ishigaki, Hiroya Takamura, Akihiko Murai, Suzuko Nishino, Hen-Hsen Huang, Hsin-Hsi Chen
Publisher: Association for Computational Linguistics (ACL)
Pages: 73-78
Publication status: Published - Aug 2024
Event: Joint Workshop of the 8th Financial Technology and Natural Language Processing (FinNLP) and the 1st Agent AI for Scenario Planning (AgentScen) - Jeju Convention Center, Jeju Island, Korea, Republic of
Duration: 3 Aug 2024 - 3 Aug 2024

Conference

Conference: Joint Workshop of the 8th Financial Technology and Natural Language Processing (FinNLP) and the 1st Agent AI for Scenario Planning (AgentScen)
Country/Territory: Korea, Republic of
City: Jeju Island
Period: 3/08/24 - 3/08/24
