Analyzing Political Text at Scale with Online Tensor LDA

Published: November 11, 2025 | arXiv ID: 2511.07809v1

By: Sara Kangaslahti, Danny Ebanks, Jean Kossaifi, and more

Potential Business Impact:

Enables fast, scalable topic analysis of text collections with billions of documents.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

This paper proposes a topic modeling method that scales linearly to billions of documents. We make three core contributions: i) we present a topic modeling method, Tensor Latent Dirichlet Allocation (TLDA), that has identifiability and parameter-recovery guarantees, as well as sample complexity guarantees for large data; ii) we show that this method is computationally and memory efficient, running over 3-4x faster than prior parallelized Latent Dirichlet Allocation (LDA) methods, and that it scales linearly to text datasets with over a billion documents; iii) we provide an open-source, GPU-based implementation of this method. This scaling enables previously prohibitive analyses, and we perform two new real-world, large-scale studies of interest to political scientists: we provide the first thorough analysis of the evolution of the #MeToo movement through the lens of over two years of Twitter conversation, and a detailed study of social media conversations about election fraud in the 2020 presidential election. This method thus gives social scientists the ability to study very large corpora at scale and to answer theoretically relevant questions about salient issues in near real-time.
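
For readers new to the tensor approach, the sketch below illustrates the core method-of-moments idea that underlies tensor LDA: build a third-order word co-occurrence tensor from the corpus and read topic-word distributions off its rank-1 components via a CP decomposition. This is a minimal Python illustration using numpy and tensorly's parafac on a small synthetic corpus; the corpus, the dimensions, and the uncorrected moment estimate are assumptions made for brevity, and it is not the paper's online, whitened, GPU-based implementation.

    # Hedged sketch of the tensor method-of-moments idea behind tensor LDA.
    # NOT the paper's implementation: the actual estimator uses Dirichlet
    # moment corrections, whitening, and online GPU updates.
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(0)

    # Small synthetic LDA-style corpus (all sizes are illustrative choices).
    V, D, K, N = 50, 2000, 5, 30                     # vocab, docs, topics, words/doc
    true_topics = rng.dirichlet(np.ones(V) * 0.1, size=K)   # K x V topic-word dists
    doc_topic = rng.dirichlet(np.ones(K) * 0.5, size=D)     # D x K topic mixtures
    docs = np.stack([rng.multinomial(N, p) for p in doc_topic @ true_topics])  # D x V counts

    # Empirical third-order word co-occurrence tensor (uncorrected, for brevity).
    probs = docs / docs.sum(axis=1, keepdims=True)           # D x V word frequencies
    M3 = np.einsum('di,dj,dk->ijk', probs, probs, probs) / D  # V x V x V moment estimate

    # Rank-K CP decomposition: each rank-1 component gives one topic direction.
    weights, factors = parafac(tl.tensor(M3), rank=K, init='random',
                               n_iter_max=200, random_state=0)
    topic_word = np.abs(tl.to_numpy(factors[0]))              # V x K factor matrix
    topic_word /= topic_word.sum(axis=0, keepdims=True)       # normalize each topic column

    print("recovered topic-word matrix:", topic_word.shape)   # (V, K)

Materializing the full V x V x V tensor, as this toy version does, is exactly what does not scale; the online, GPU-based method described in the abstract is designed to avoid that cost, which is what makes linear scaling to billion-document corpora feasible.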

Page Count
64 pages

Category
Computer Science:
Machine Learning (CS)