From Tables to Signals: Revealing Spectral Adaptivity in TabPFN
By: Jianqiao Zheng, Cameron Gordon, Yiping Ji, et al.
Potential Business Impact:
Enables training-free, hyperparameter-free image denoising using a pretrained tabular model.
Task-agnostic tabular foundation models such as TabPFN have achieved impressive performance on tabular learning tasks, yet the origins of their inductive biases remain poorly understood. In this work, we study TabPFN through the lens of signal reconstruction and provide the first frequency-based analysis of its in-context learning behavior. We show that TabPFN possesses a broader effective frequency capacity than standard ReLU-MLPs, even without hyperparameter tuning. Moreover, unlike MLPs whose spectra evolve primarily over training epochs, we find that TabPFN's spectral capacity adapts directly to the number of samples provided in-context, a phenomenon we term Spectral Adaptivity. We further demonstrate that positional encoding modulates TabPFN's frequency response, mirroring classical results in implicit neural representations. Finally, we show that these properties enable TabPFN to perform training-free and hyperparameter-free image denoising, illustrating its potential as a task-agnostic implicit model. Our analysis provides new insight into the structure and inductive biases of tabular foundation models and highlights their promise for broader signal reconstruction tasks.
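The denoising setup described in the abstract casts an image as a table: each pixel becomes a row whose features are its (x, y) coordinates and whose target is its noisy intensity, and a model then predicts a cleaned value per pixel in-context, with no gradient training. The sketch below illustrates this tabular framing with plain numpy; a k-nearest-neighbour average stands in as a placeholder predictor, since TabPFN itself is not assumed to be installed (with the real model, the same `X`, `y` arrays would be passed to `TabPFNRegressor.fit`/`.predict`). All names here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a small synthetic "image": a smooth 2D signal plus Gaussian noise.
n = 32
xs, ys = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
clean = np.sin(2 * np.pi * xs / n) * np.cos(2 * np.pi * ys / n)
noisy = clean + 0.3 * rng.standard_normal(clean.shape)

# Tabular view: features are pixel coordinates, targets are noisy intensities.
X = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
y = noisy.ravel()

def knn_denoise(X, y, k=9):
    """Placeholder in-context predictor: average the k nearest pixels.

    With TabPFN, this function would be replaced by fitting the model on
    (X, y) and predicting values back at the same coordinates.
    """
    out = np.empty_like(y)
    for i, p in enumerate(X):
        d = np.linalg.norm(X - p, axis=1)
        idx = np.argsort(d)[:k]
        out[i] = y[idx].mean()
    return out

denoised = knn_denoise(X, y).reshape(n, n)

# Local averaging suppresses high-frequency noise while keeping the
# slowly varying signal, so reconstruction error should drop.
mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
print(mse_denoised < mse_noisy)
```

The interesting claim in the paper is that TabPFN's effective frequency capacity in this role adapts to the number of in-context samples, whereas a fixed smoother like the k-NN placeholder above has a capacity set once by its hyperparameter `k`.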