A Simple Approach to Financial Relation Classification with Pre-trained Language Models

Le Qiu, Bo Peng, Yu Yin Hsu, Emmanuele Chersoni

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

This paper is an experimental report submitted to the KDF-SIGIR 2023 shared task on relation extraction, which focuses on the REFinD dataset. Motivated by recent advances in Pre-trained Language Models (PLMs), we propose a simple yet effective approach that leverages popular PLMs such as BERT and RoBERTa to address this challenge. The approach capitalizes on the inherent ability of PLMs to encode sequences and enriches the semantics of the representations at the entity level. We go beyond the lexical and semantic levels by incorporating supplementary information to tackle the challenges of financial relation classification. In the paper, we detail and justify the approach and report the results of our ablation studies.
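The abstract describes encoding sequences with a PLM and deriving entity-level representations. A common way to do this, which the paper may or may not follow exactly, is to wrap each entity mention in marker tokens before encoding and then pool the hidden states inside each entity span. The sketch below illustrates that idea with plain Python lists standing in for a PLM's hidden states; the marker tokens and function names are illustrative assumptions, not taken from the paper's code.

```python
# Hypothetical sketch of entity markers + span pooling for relation
# classification. Lists of floats stand in for per-token hidden states
# that a PLM such as BERT or RoBERTa would produce.

def insert_entity_markers(tokens, e1_span, e2_span,
                          e1_marks=("[E1]", "[/E1]"),
                          e2_marks=("[E2]", "[/E2]")):
    """Wrap two half-open entity spans (start, end) with marker tokens."""
    # Process the later span first so earlier indices stay valid.
    spans = sorted([(e1_span, e1_marks), (e2_span, e2_marks)],
                   key=lambda item: item[0][0], reverse=True)
    out = list(tokens)
    for (start, end), (open_mark, close_mark) in spans:
        out[end:end] = [close_mark]      # insert closing marker after the span
        out[start:start] = [open_mark]   # insert opening marker before the span
    return out

def mean_pool(hidden_states, span):
    """Average the per-token vectors inside a half-open span."""
    start, end = span
    vecs = hidden_states[start:end]
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]
```

For example, `insert_entity_markers(["Apple", "acquired", "Beats", "in", "2014"], (0, 1), (2, 3))` yields `["[E1]", "Apple", "[/E1]", "acquired", "[E2]", "Beats", "[/E2]", "in", "2014"]`; the two pooled span vectors would then typically be concatenated and fed to a classification head.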
Original language: English
Title of host publication: Proceedings of the 4th Workshop on Knowledge Discovery from Unstructured Data in Financial Services
Publisher: Association for Computing Machinery
Publication status: Published - Jul 2023
Event: The 4th Workshop on Knowledge Discovery from Unstructured Data in Financial Services, Taipei International Convention Center, Taipei, Taiwan
Duration: 27 Jul 2023 – 27 Jul 2023
https://kdf-workshop.github.io/kdf23/

Workshop

Workshop: The 4th Workshop on Knowledge Discovery from Unstructured Data in Financial Services
Abbreviated title: SIGIR 23 KDF
Country/Territory: Taiwan
City: Taipei
Period: 27/07/23 – 27/07/23
Internet address: https://kdf-workshop.github.io/kdf23/

Keywords

  • financial relation extraction
  • relation classification
  • shortest dependency path (SDP)

