Bridging Control Variates and Regression Adjustment in A/B Testing: From Design-Based to Model-Based Frameworks
By: Yu Zhang, Bokui Wan, Yongli Qin
Potential Business Impact:
Makes online A/B tests more sensitive and accurate.
A/B testing serves as the gold standard for large-scale, data-driven decision making in online businesses. To mitigate metric variability and enhance testing sensitivity, control variates and regression adjustment have emerged as prominent variance reduction techniques, leveraging pre-experiment data to improve estimator performance. Over the past decade, these methods have spawned numerous derivatives, yet their theoretical connections and comparative properties remain underexplored. In this paper, we conduct a comprehensive analysis of their statistical properties, establish a formal bridge between the two frameworks in practical implementations, and extend the investigation from design-based to model-based frameworks. Through simulation studies and real-world experiments at ByteDance, we validate our theoretical insights across both frameworks. Our work aims to provide rigorous guidance for practitioners in online controlled experiments, addressing critical considerations of internal and external validity. The recommended method, control variates with group-specific coefficient estimates, has been fully implemented and deployed on ByteDance's experimental platform.
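To make the recommended estimator concrete, below is a minimal sketch of control variates with group-specific coefficients: each arm's coefficient is the within-arm Cov(Y, X)/Var(X) for a pre-experiment covariate X, and the covariate-adjusted arm means are differenced. The function name cv_group_ate and the simulated data are illustrative assumptions, not the paper's ByteDance implementation.

```python
import numpy as np

def cv_group_ate(y, x, w):
    """ATE estimate via control variates with group-specific coefficients.

    y : array of outcome metric values, one per unit
    x : array of pre-experiment covariate values (the control variate)
    w : array of 0/1 treatment assignments
    """
    x_bar = x.mean()  # pooled covariate mean used to center both arms
    adj_mean = {}
    for g in (0, 1):
        yg, xg = y[w == g], x[w == g]
        # coefficient estimated within the arm: Cov(Y, X) / Var(X)
        theta_g = np.cov(yg, xg, ddof=1)[0, 1] / xg.var(ddof=1)
        # adjusted arm mean: remove the covariate's chance imbalance
        adj_mean[g] = yg.mean() - theta_g * (xg.mean() - x_bar)
    return adj_mean[1] - adj_mean[0]

# Illustrative run on simulated data with a true effect of 0.2.
rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=n)                      # pre-experiment metric
w = rng.integers(0, 2, size=n)              # randomized assignment
y = 0.2 * w + 0.8 * x + rng.normal(size=n)  # outcome during the experiment
print(cv_group_ate(y, x, w))                # near 0.2, with reduced variance
```

With group-specific coefficients, this estimator coincides with regression adjustment using treatment-covariate interactions, which is one face of the bridge between the two frameworks studied in the paper.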
Similar Papers
Beyond Basic A/B testing: Improving Statistical Efficiency for Business Growth
Methodology
Improves website tests for better business results.
Design-based finite-sample analysis for regression adjustment
Statistics Theory
Makes study results more accurate, even without large samples.
Unifying regression-based and design-based causal inference in time-series experiments
Methodology
Helps doctors test treatments on one person.