Enabling Equitable Access to Trustworthy Financial Reasoning
By: William Jurayj, Nils Holzenberger, Benjamin Van Durme
Potential Business Impact:
Helps computers do taxes accurately and cheaply.
According to the United States Internal Revenue Service, "the average American spends $270 and 13 hours filing their taxes". Even beyond the U.S., tax filing requires complex reasoning, combining the application of overlapping rules with numerical calculations. Because errors can incur costly penalties, any automated system must deliver high accuracy and auditability, making modern large language models (LLMs) poorly suited for this task. We propose an approach that integrates LLMs with a symbolic solver to calculate tax obligations. We evaluate variants of this system on the challenging StAtutory Reasoning Assessment (SARA) dataset, and include a novel method for estimating the cost of deploying such a system based on real-world penalties for tax errors. We further show how up-front translation of plain-text rules into formal logic programs, combined with intelligently retrieved exemplars for formal case representations, can dramatically improve performance on this task and reduce costs to well below real-world averages. Our results demonstrate the promise and economic feasibility of neuro-symbolic architectures for increasing equitable access to reliable tax assistance.
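The division of labor the abstract describes can be sketched in miniature: a language model translates a plain-text scenario into a formal case representation, and a symbolic component applies the encoded rules deterministically, so every answer can be audited against the rules themselves. The sketch below is illustrative only; the rule table, function names, and case format are hypothetical stand-ins, not the paper's actual logic programs.

```python
# Minimal sketch of a neuro-symbolic tax pipeline. In the full system an
# LLM would translate statutes into formal logic programs; here a single
# rule is hand-encoded so the symbolic step is easy to inspect.

# Hypothetical rule table (2017-style standard deduction amounts).
STANDARD_DEDUCTION = {
    "single": 6350,
    "married_filing_jointly": 12700,
}

def taxable_income(case):
    """Symbolic step: deterministic, auditable arithmetic over a case."""
    deduction = STANDARD_DEDUCTION[case["filing_status"]]
    return max(0, case["gross_income"] - deduction)

# A formal case representation an LLM might extract from a plain-text
# scenario such as "Alice, filing as single, earned $50,000 in 2017."
case = {"gross_income": 50000, "filing_status": "single"}
print(taxable_income(case))  # 43650
```

Because the calculation lives in the symbolic layer, an error can be traced to either a faulty case extraction or a faulty rule translation, which is the auditability property the abstract argues LLMs alone cannot provide.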
Similar Papers
Language Models and Logic Programs for Trustworthy Financial Reasoning
Computation and Language
Helps computers do taxes accurately and cheaply.
Can LLMs Identify Tax Abuse?
Computational Finance
AI finds new ways to save money on taxes.
Taxation Perspectives from Large Language Models: A Case Study on Additional Tax Penalties
Computation and Language
Helps computers understand tax rules better.