Technical Challenges in Maintaining Tax Prep Software with Large Language Models
By: Sina Gogani-Khiabani, Varsha Dewangan, Nina Olson, and more
Potential Business Impact:
Automates tax software updates from new laws.
As US tax law evolves to adapt to ever-changing politico-economic realities, tax preparation software plays a significant role in helping taxpayers navigate these complexities. The dynamic nature of tax regulations poses a significant challenge to the accurate and timely maintenance of tax software artifacts. The state of the art in maintaining tax prep software is time-consuming and error-prone, as it involves manual code analysis combined with expert interpretation of tax law amendments. We posit that the rigor and formality of tax amendment language, as expressed in IRS publications, makes it amenable to automatic translation into executable specifications (code). Our research efforts focus on identifying, understanding, and tackling technical challenges in leveraging Large Language Models (LLMs), such as ChatGPT and Llama, to faithfully extract code differentials from IRS publications and automatically integrate them with the prior version of the code, thereby automating tax prep software maintenance.
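To make the notion of a "code differential" concrete, here is a minimal sketch of the kind of artifact such a pipeline would produce. The function, file names, and dollar amounts are illustrative assumptions (loosely modeled on recent standard-deduction figures), not taken from the paper or from actual IRS logic; the LLM step is omitted and only the diff computation between a prior and an amended rule encoding is shown.

```python
import difflib

# Hypothetical prior-year rule encoded as code (illustrative only).
prior = """def standard_deduction(filing_status):
    return {"single": 13850, "married_joint": 27700}[filing_status]
""".splitlines(keepends=True)

# Hypothetical amended version that an LLM might derive from an
# IRS publication describing updated deduction amounts.
amended = """def standard_deduction(filing_status):
    return {"single": 14600, "married_joint": 29200}[filing_status]
""".splitlines(keepends=True)

# The "code differential": a unified diff to be reviewed and merged
# into the prior version of the tax prep software.
diff = list(difflib.unified_diff(
    prior, amended,
    fromfile="rules_prior.py", tofile="rules_amended.py",
))
print("".join(diff))
```

In practice the diff would be generated against the real rule codebase and validated by experts before integration; the sketch only illustrates the target representation.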
Similar Papers
Language Models and Logic Programs for Trustworthy Financial Reasoning
Computation and Language
Helps computers do taxes accurately and cheaply.
An LLM Agentic Approach for Legal-Critical Software: A Case Study for Tax Prep Software
Software Engineering
Makes legal-critical tax software work better.
Taxation Perspectives from Large Language Models: A Case Study on Additional Tax Penalties
Computation and Language
Helps computers understand tax rules better.