
The Mask One at a Time Framework for Detecting the Relationship Between Financial Entities

EasyChair Preprint 10463

4 pages · Date: June 28, 2023

Abstract

In the financial domain, understanding the relationship between two entities helps in interpreting financial texts. In this paper, we introduce the Mask One At a Time (MOAT) framework for detecting the relationship between financial entities, and we benchmark its performance against existing state-of-the-art discriminative and generative Large Language Models (LLMs). As features, we use SEC-BERT embeddings along with one-hot encoded vectors of the entity types and their relation group. We benchmark MOAT against three open-source LLMs, namely Falcon, Dolly, and MPT, under zero-shot and few-shot settings. The results show that MOAT outperforms these LLMs.
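The feature construction described above can be sketched in code. The following is a minimal sketch, assuming the publicly available nlpaueb/sec-bert-base checkpoint on Hugging Face; the masking heuristic, the entity-type and relation-group inventories, and the mean-pooling strategy are illustrative assumptions, not the authors' released implementation.

# A minimal sketch of MOAT-style feature construction, assuming the
# publicly available "nlpaueb/sec-bert-base" checkpoint on Hugging Face.
# The masking heuristic, label inventories, and pooling strategy are
# illustrative assumptions, not the authors' released implementation.
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nlpaueb/sec-bert-base")
model = AutoModel.from_pretrained("nlpaueb/sec-bert-base")
model.eval()

# Hypothetical label inventories for the one-hot features.
ENTITY_TYPES = ["ORG", "PERSON", "MONEY", "DATE"]
RELATION_GROUPS = ["ownership", "employment", "transaction"]

def one_hot(label, vocab):
    vec = np.zeros(len(vocab), dtype=np.float32)
    vec[vocab.index(label)] = 1.0
    return vec

def embed(text):
    # Mean-pooled SEC-BERT embedding of one sentence.
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0).numpy()

def moat_features(sentence, ent1, ent2, type1, type2, group):
    # "Mask One At a Time": encode the sentence twice, each time
    # replacing one entity mention with the [MASK] token. (replace()
    # masks every occurrence; a real pipeline would use span offsets.)
    masked1 = sentence.replace(ent1, tokenizer.mask_token)
    masked2 = sentence.replace(ent2, tokenizer.mask_token)
    return np.concatenate([
        embed(masked1),
        embed(masked2),
        one_hot(type1, ENTITY_TYPES),
        one_hot(type2, ENTITY_TYPES),
        one_hot(group, RELATION_GROUPS),
    ])

feats = moat_features(
    "Apple acquired Beats for USD 3 billion.",
    "Apple", "Beats", "ORG", "ORG", "ownership",
)
print(feats.shape)  # 2 * 768 + 4 + 4 + 3 = 1547 dimensions

Under these assumptions, the concatenated vector would then feed a discriminative classifier that predicts the relationship label.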

Keyphrases: Financial Texts, Relation Extraction, Large Language Models

BibTeX entry
BibTeX does not have a suitable entry type for preprints; the following is a workaround that produces the correct reference:
@booklet{EasyChair:10463,
  author       = {Sohom Ghosh and Sachin Umrao and Chung-Chi Chen and Sudip Kumar Naskar},
  title        = {The Mask One at a Time Framework for Detecting the Relationship Between Financial Entities},
  howpublished = {EasyChair Preprint 10463},
  year         = {EasyChair, 2023}}