The Physics of Data and Tasks: Theories of Locality and Compositionality in Deep Learning

Published: October 7, 2025 | arXiv ID: 2510.06106v1

By: Alessandro Favero

Potential Business Impact:

Explains how latent structure in data (locality and compositionality) makes deep learning tractable and how performance improves with more training examples.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Deep neural networks have achieved remarkable success, yet our understanding of how they learn remains limited. These models can learn high-dimensional tasks, a feat that is generally statistically intractable due to the curse of dimensionality. This apparent paradox suggests that learnable data must have an underlying latent structure. What is the nature of this structure? How do neural networks encode and exploit it, and how does it quantitatively impact performance? For instance, how does generalization improve with the number of training examples? This thesis addresses these questions by studying the roles of locality and compositionality in data, tasks, and deep learning representations.
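As a hedged illustration (an expository assumption, not a result quoted from this abstract), questions like "how does generalization improve with the number of training examples?" are typically formalized as scaling laws. Writing n for the number of training examples and \varepsilon(n) for the test error, a power-law ansatz reads, in LaTeX:

\varepsilon(n) \approx C \, n^{-\beta}

Here the exponent \beta is expected to be set by the latent structure of the data, such as its effective dimensionality or compositional hierarchy, rather than by the full input dimension d. Without such structure, worst-case rates degrade exponentially with d (exponents of order 1/d), which would make learning from realistic sample sizes hopeless.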

Page Count
333 pages

Category
Computer Science:
Machine Learning (CS)