Discriminative classification with generative features: bridging Naive Bayes and logistic regression
By: Zachary Terner, Alexander Petersen, Yuedong Wang
Potential Business Impact:
Makes computer guesses smarter by combining two methods.
We introduce Smart Bayes, a new classification framework that bridges generative and discriminative modeling by integrating likelihood-ratio-based generative features into a logistic-regression-style discriminative classifier. From the generative perspective, Smart Bayes relaxes the fixed unit weights of Naive Bayes by allowing data-driven coefficients on density-ratio features. From the discriminative perspective, it constructs transformed inputs as marginal log-density ratios that explicitly quantify how much more likely each feature value is under one class than another, thereby providing predictors with stronger class separation than the raw covariates. To support this framework, we develop a spline-based estimator for univariate log-density ratios that is flexible, robust, and computationally efficient. In extensive simulations and real-data studies, Smart Bayes often outperforms both logistic regression and Naive Bayes. Our results highlight the potential of hybrid approaches that exploit generative structure to enhance discriminative performance.
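The abstract's core idea can be sketched in a few lines: build marginal log-density-ratio features z_j(x) = log f_j(x | y=1) − log f_j(x | y=0), then fit a logistic regression on them so the weights are learned from data (Naive Bayes corresponds to fixing every weight at 1). The sketch below is a minimal illustration, not the authors' method: it substitutes per-feature Gaussian density fits for the paper's spline-based estimator, and all data, names, and hyperparameters are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class data: each feature is Gaussian with a class-dependent mean.
n, p = 400, 3
y = rng.integers(0, 2, n)
X = rng.normal(loc=y[:, None] * np.array([1.0, 0.5, 0.0]), scale=1.0, size=(n, p))

def gaussian_logpdf(x, mu, sigma):
    """Elementwise log N(x; mu, sigma^2)."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

# Step 1: generative features. Estimate each marginal density per class
# (Gaussian fit here as a stand-in for the paper's spline estimator) and
# form log-density-ratio features z_j(x) = log f_j(x | y=1) - log f_j(x | y=0).
mu0, sd0 = X[y == 0].mean(0), X[y == 0].std(0)
mu1, sd1 = X[y == 1].mean(0), X[y == 1].std(0)
Z = gaussian_logpdf(X, mu1, sd1) - gaussian_logpdf(X, mu0, sd0)

# Step 2: discriminative weights. Fit logistic regression on Z by gradient
# ascent on the log-likelihood; Naive Bayes is the special case beta_j = 1.
def fit_logistic(Z, y, lr=0.1, iters=2000):
    Zb = np.column_stack([np.ones(len(Z)), Z])   # add intercept column
    beta = np.zeros(Zb.shape[1])
    for _ in range(iters):
        prob = 1 / (1 + np.exp(-Zb @ beta))
        beta += lr * Zb.T @ (y - prob) / len(y)  # average log-likelihood gradient
    return beta

beta = fit_logistic(Z, y)
scores = np.column_stack([np.ones(n), Z]) @ beta
acc = ((scores > 0) == y).mean()
```

Because the informative signal lives in the first two features, the learned weights on their ratio features dominate, while the uninformative third feature receives a coefficient near zero — the data-driven relaxation of Naive Bayes's fixed unit weights that the abstract describes.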
Similar Papers
Interpretable Generative and Discriminative Learning for Multimodal and Incomplete Clinical Data
Machine Learning (Stat)
Helps doctors understand sick people better.
LLM-Augmented and Fair Machine Learning Framework for University Admission Prediction
Computers and Society
Helps colleges pick students more fairly.
From Partial Exchangeability to Predictive Probability: A Bayesian Perspective on Classification
Methodology
Helps computers guess better with less data.