uGMM-NN: Univariate Gaussian Mixture Model Neural Network
By: Zakeria Sharif Ali
Potential Business Impact:
Lets each neuron in a network report uncertainty about its own activation, so predictions come with a built-in measure of confidence.
This paper introduces the Univariate Gaussian Mixture Model Neural Network (uGMM-NN), a novel neural architecture that embeds probabilistic reasoning directly into the computational units of deep networks. Unlike traditional neurons, which apply weighted sums followed by fixed nonlinearities, each uGMM-NN node parameterizes its activations as a univariate Gaussian mixture, with learnable means, variances, and mixing coefficients. This design enables richer representations by capturing multimodality and uncertainty at the level of individual neurons, while retaining the scalability of standard feedforward networks. We demonstrate that uGMM-NN can achieve competitive discriminative performance compared to conventional multilayer perceptrons, while additionally offering a probabilistic interpretation of activations. The proposed framework provides a foundation for integrating uncertainty-aware components into modern neural architectures, opening new directions for both discriminative and generative modeling.
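The abstract describes each uGMM-NN unit as a univariate Gaussian mixture with learnable means, variances, and mixing coefficients, but does not spell out the exact forward computation. Below is a minimal sketch of one plausible reading, assuming each unit forms a linear combination of its inputs (as in an MLP) and then uses the log-density of that value under its own K-component mixture as the activation. The class name `UGMMLayer`, the linear pre-combination, and all hyperparameters are illustrative assumptions, not the paper's definition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class UGMMLayer(nn.Module):
    """Hypothetical sketch of a uGMM-style layer (not the paper's exact formulation).

    Each output unit combines its inputs linearly, then scores the result under a
    learnable K-component univariate Gaussian mixture; the mixture log-density
    serves as the unit's activation.
    """

    def __init__(self, in_features: int, out_features: int, num_components: int = 3):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Per-unit mixture parameters: means, log-variances, and mixing logits.
        self.means = nn.Parameter(torch.randn(out_features, num_components))
        self.log_vars = nn.Parameter(torch.zeros(out_features, num_components))
        self.mix_logits = nn.Parameter(torch.zeros(out_features, num_components))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.linear(x).unsqueeze(-1)          # (batch, out_features, 1)
        var = self.log_vars.exp()                 # (out_features, K)
        # Per-component univariate Gaussian log-density of z.
        log_norm = -0.5 * (torch.log(2 * torch.pi * var) + (z - self.means) ** 2 / var)
        log_mix = F.log_softmax(self.mix_logits, dim=-1)
        # Mixture log-density: log-sum-exp over the K components per unit.
        return torch.logsumexp(log_mix + log_norm, dim=-1)


# Usage sketch: a two-layer uGMM-style network for 10-class classification.
model = nn.Sequential(UGMMLayer(784, 128), UGMMLayer(128, 10))
scores = model(torch.randn(32, 784))              # (32, 10)
```

Under this reading, the log-sum-exp over components is what lets a single unit capture multimodal responses, while the layer stays roughly as cheap as a standard linear layer plus an elementwise density evaluation, consistent with the abstract's claim of retaining feedforward scalability.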
Similar Papers
GNN-based Unified Deep Learning
Machine Learning (CS)
Helps AI learn from different medical images.
Graph-Regularized Learning of Gaussian Mixture Models
Machine Learning (CS)
Shares computer learning without sharing private data.
UQGNN: Uncertainty Quantification of Graph Neural Networks for Multivariate Spatiotemporal Prediction
Machine Learning (CS)
Predicts city events and how sure it is.