Better with Less: Small Proprietary Models Surpass Large Language Models in Financial Transaction Understanding
By: Wanying Ding, Savinay Narendra, Xiran Shi, and more
Potential Business Impact:
Finds fraud faster, saves millions yearly.
Analyzing financial transactions is crucial for ensuring regulatory compliance, detecting fraud, and supporting decisions. The complexity of financial transaction data necessitates advanced techniques to extract meaningful insights and ensure accurate analysis. Since Transformer-based models have shown outstanding performance across multiple domains, this paper explores their potential for understanding financial transactions. We conduct extensive experiments to evaluate three types of Transformer models: Encoder-Only, Decoder-Only, and Encoder-Decoder. For each type, we explore three options: pretrained LLMs, fine-tuned LLMs, and small proprietary models developed from scratch. Our analysis reveals that while LLMs, such as LLaMA3-8b, Flan-T5, and SBERT, demonstrate impressive capabilities across various natural language processing tasks, they do not significantly outperform small proprietary models in the specific context of financial transaction understanding. This gap is particularly evident in speed and cost efficiency: proprietary models, tailored to the unique requirements of transaction data, exhibit faster processing times and lower operational costs, making them more suitable for real-time applications in the financial sector. Our findings highlight the importance of model selection based on domain-specific needs and underscore the potential advantages of customized proprietary models over general-purpose LLMs in specialized applications. Ultimately, we chose to implement a proprietary decoder-only model to handle the complex transactions that we previously could not manage. This model improves transaction coverage by 14% and saves more than $13 million in annual costs.
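The abstract contrasts large pretrained LLMs with small proprietary models built from scratch. As a rough illustration of what a "small proprietary decoder-only model" for transaction understanding might look like, here is a minimal PyTorch sketch of a GPT-style classifier over tokenized transaction descriptions. All names, dimensions, the vocabulary size, and the classification head are illustrative assumptions and do not reflect the paper's actual architecture.

```python
# Hypothetical sketch: a small decoder-only (GPT-style) Transformer that
# classifies tokenized transaction descriptions into spending categories.
# Every hyperparameter below is an assumption for illustration only.
import torch
import torch.nn as nn


class SmallTransactionDecoder(nn.Module):
    def __init__(self, vocab_size=8000, d_model=128, n_heads=4,
                 n_layers=4, max_len=64, n_classes=50):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        # Causal self-attention over these layers makes the stack decoder-only.
        self.blocks = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer-encoded transaction text
        seq_len = token_ids.size(1)
        pos = torch.arange(seq_len, device=token_ids.device)
        x = self.tok_emb(token_ids) + self.pos_emb(pos)
        causal_mask = nn.Transformer.generate_square_subsequent_mask(
            seq_len).to(token_ids.device)
        x = self.blocks(x, mask=causal_mask)
        # Use the final token's hidden state to predict the category.
        return self.head(x[:, -1, :])


# Usage: score a toy batch of two tokenized transaction descriptions.
model = SmallTransactionDecoder()
batch = torch.randint(0, 8000, (2, 32))   # placeholder token ids
logits = model(batch)                     # shape: (2, 50)
```

A model of this size (a few million parameters) can run on commodity CPUs with millisecond-level latency, which is the kind of speed and cost advantage over multi-billion-parameter LLMs that the abstract emphasizes.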
Similar Papers
Your Spending Needs Attention: Modeling Financial Habits with Transformers
Information Retrieval
Helps banks understand customers better from their spending.
Beyond Decoder-only: Large Language Models Can be Good Encoders for Machine Translation
Computation and Language
Makes computer translation faster and uses less memory.
Encoder-Decoder or Decoder-Only? Revisiting Encoder-Decoder Large Language Model
Computation and Language
Makes AI smarter and faster to use.