Score: 3

Poisson Hyperplane Processes with Rectified Linear Units

Published: January 9, 2026 | arXiv ID: 2601.05586v1

By: Shufei Ge, Shijia Wang, Lloyd Elliott

Potential Business Impact:

Offers a probabilistic alternative to standard two-layer ReLU neural networks that improves predictive performance, provides Bayesian uncertainty estimates, and scales to large problems.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Neural networks have shown state-of-the-art performance in various classification and regression tasks. Rectified linear units (ReLU) are often used as activation functions for the hidden layers of a neural network model. In this article, we establish a connection between Poisson hyperplane processes (PHP) and two-layer ReLU neural networks. We show that a PHP with a Gaussian prior provides an alternative probabilistic representation of a two-layer ReLU neural network. In addition, we show that a two-layer neural network constructed from a PHP scales to large problems via decomposition propositions. Finally, we propose an annealed sequential Monte Carlo algorithm for Bayesian inference. Our numerical experiments demonstrate that the proposed method outperforms a classic two-layer ReLU neural network. The implementation of our proposed model is available at https://github.com/ShufeiGe/Pois_Relu.git.
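
To make the PHP-ReLU correspondence concrete, here is a minimal NumPy sketch of one prior draw: the hidden units correspond to the hyperplanes of a Poisson process (a Poisson-distributed number of units with random directions and offsets), and the output weights carry a Gaussian prior, matching the abstract's setup. All function and parameter names (sample_php_relu, rate, radius, weight_sd) are illustrative assumptions, not the notation or code from the paper's repository.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_php_relu(d, rate=20.0, radius=3.0, weight_sd=1.0):
    """Sample a random two-layer ReLU network whose hidden units
    correspond to hyperplanes of a Poisson hyperplane process.
    Hyperparameter names here are illustrative, not the paper's."""
    # Number of hyperplanes is Poisson-distributed (PHP intensity).
    k = rng.poisson(rate)
    # Unit normal directions, uniform on the sphere in R^d.
    a = rng.standard_normal((k, d))
    a /= np.linalg.norm(a, axis=1, keepdims=True)
    # Signed offsets, uniform over a window of the given radius.
    b = rng.uniform(-radius, radius, size=k)
    # Output weights: Gaussian prior, as in the paper's setup.
    w = rng.normal(0.0, weight_sd, size=k)
    return a, b, w

def forward(x, a, b, w):
    """Evaluate f(x) = sum_k w_k * ReLU(a_k . x - b_k)."""
    pre = x @ a.T - b              # (n, k) pre-activations
    return np.maximum(pre, 0.0) @ w

a, b, w = sample_php_relu(d=2)
x = rng.standard_normal((5, 2))
print(forward(x, a, b, w))         # one prior draw of the regression function
```

Bayesian inference in the paper proceeds over such draws via an annealed sequential Monte Carlo algorithm; the sketch above only shows the prior representation, not that inference procedure.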

Country of Origin
🇨🇳 🇨🇦 China, Canada

Repos / Data Links
https://github.com/ShufeiGe/Pois_Relu.git

Page Count
24 pages

Category
Computer Science: Machine Learning (CS)