OLC-WA: Drift Aware Tuning-Free Online Classification with Weighted Average
By: Mohammad Abu Shaira, Yunhe Feng, Heng Fan, and more
Real-world datasets often exhibit temporal dynamics characterized by evolving data distributions. Disregarding this phenomenon, commonly referred to as concept drift, can significantly diminish a model's predictive accuracy. The problem is compounded by the hyperparameters of online models, which are typically fixed and cannot be adjusted by the user as the data distribution evolves. This paper introduces Online Classification with Weighted Average (OLC-WA), an adaptive, hyperparameter-free online classification model equipped with an automated optimization mechanism. OLC-WA operates by blending incoming data streams with an existing base model via an exponentially weighted moving average. An integrated optimization mechanism dynamically detects concept drift, quantifies its magnitude, and adjusts the model according to the observed characteristics of the data stream, enabling the model to adapt to evolving distributions in streaming environments. Rigorous empirical evaluation across diverse benchmark datasets shows that OLC-WA achieves performance comparable to batch models in stationary environments, maintaining accuracy within 1-3%, and surpasses leading online baselines by 10-25% under drift, demonstrating its effectiveness in adapting to dynamic data streams.
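The abstract does not include pseudocode, but the core update it describes can be sketched concretely: fit a throwaway classifier on each incoming batch, estimate how much it disagrees with the base model, and blend the two parameter sets with an exponentially weighted moving average whose weight grows when drift is suspected. The sketch below is a minimal illustration under those assumptions; the function names, the angle-based drift score, and the threshold are hypothetical and are not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def drift_magnitude(base_coef, batch_coef):
    """Hypothetical drift score: disagreement between the base decision
    boundary and the one fit on the newest batch, via cosine distance."""
    a, b = base_coef.ravel(), batch_coef.ravel()
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return 1.0 - cos  # 0 = aligned boundaries, up to 2 = reversed

def olc_wa_step(base_model, X_batch, y_batch,
                alpha=0.1, drift_threshold=0.5):
    """One online update in the spirit of OLC-WA: fit a model on the
    incoming batch, then EWMA-blend its parameters into the base model.
    If the estimated drift is large, trust the new batch more by
    increasing the blending weight (an assumed adaptation rule)."""
    # Note: the batch must contain both classes for this fit to succeed.
    batch_model = LogisticRegression().fit(X_batch, y_batch)
    score = drift_magnitude(base_model.coef_, batch_model.coef_)
    if score > drift_threshold:
        alpha = min(1.0, alpha + score)  # illustrative drift response
    base_model.coef_ = ((1 - alpha) * base_model.coef_
                        + alpha * batch_model.coef_)
    base_model.intercept_ = ((1 - alpha) * base_model.intercept_
                             + alpha * batch_model.intercept_)
    return base_model, score
```

In this reading, `alpha` plays the role of the moving-average weight: small values keep the model stable on stationary streams, while the drift test temporarily raises it so the blend favors recent data, which is one plausible way to realize the tuning-free adaptation the abstract claims.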