Bloom Filter Encoding for Machine Learning
By: John Cartmell, Mihaela Cardei, Ionut Cardei
We present a method that uses the Bloom filter transform to preprocess data for machine learning. Each sample is encoded into a compact, privacy-preserving bit array. This reduces memory use and protects the original data while keeping enough structure for accurate classification. We test the method on six datasets: SMS Spam Collection, ECG200, Adult 50K, CDC Diabetes, MNIST, and Fashion MNIST. Four classifiers are used: Extreme Gradient Boosting, Deep Neural Networks, Convolutional Neural Networks, and Logistic Regression. Results show that models trained on Bloom filter encodings achieve accuracy similar to models trained on raw data or other transforms. At the same time, the method provides memory savings while enhancing privacy. These results suggest that the Bloom filter transform is an efficient preprocessing approach for diverse machine learning tasks.
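To make the encoding step concrete, the sketch below shows one common way to realize a Bloom filter transform as an ML preprocessing step in Python: each sample's feature tokens are hashed k times into a fixed-length m-bit array, and the resulting bit vectors are fed to a standard classifier. The parameter values, the salted-SHA-256 hashing scheme, and the toy SMS tokens are illustrative assumptions, not the exact configuration used in the paper.

```python
import hashlib
import numpy as np

def bloom_encode(tokens, m=256, k=4):
    """Encode one sample's feature tokens into an m-bit Bloom filter vector.

    m and k are illustrative defaults, not the paper's tuned settings.
    """
    bits = np.zeros(m, dtype=np.uint8)
    for token in tokens:
        for seed in range(k):
            # Salt the token with the seed to simulate k independent hash
            # functions, then map the digest to a bit position in the filter.
            digest = hashlib.sha256(f"{seed}:{token}".encode()).hexdigest()
            bits[int(digest, 16) % m] = 1
    return bits

# Usage: tokenized SMS messages become fixed-length, one-way-encoded bit
# vectors that any standard classifier (e.g. logistic regression) can consume.
samples = [["free", "prize", "call", "now"], ["see", "you", "at", "lunch"]]
X = np.vstack([bloom_encode(s) for s in samples])
print(X.shape)  # (2, 256)
```

Because the encoding is a fixed-length binary vector produced by one-way hashing, it keeps memory use bounded and does not expose the original feature values directly, which is the trade-off the abstract describes.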