Smooth Approximations of the Rounding Function
By: Stanislav Semenov
Potential Business Impact:
Lets machine-learning models train with gradient-based methods by replacing hard rounding with smooth, differentiable approximations.
We propose novel smooth approximations to the classical rounding function, suitable for differentiable optimization and machine learning applications. Our constructions are based on two approaches: (1) localized sigmoid window functions centered at each integer, and (2) normalized weighted sums of sigmoid derivatives representing local densities. The first method approximates the step-like behavior of rounding through differences of shifted sigmoids, while the second method achieves smooth interpolation between integers via density-based weighting. Both methods converge pointwise to the classical rounding function as the sharpness parameter k tends to infinity, and allow controlled trade-offs between smoothness and approximation accuracy. We demonstrate that by restricting the summation to a small set of nearest integers, the computational cost remains low without sacrificing precision. These constructions provide fully differentiable alternatives to hard rounding, which are valuable in contexts where gradient-based methods are essential.
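The abstract does not give the exact formulas, but its two constructions admit a natural reading: method (1) weights each integer n by a smooth window built from a difference of shifted sigmoids, and method (2) takes a weighted average of nearby integers with weights proportional to the sigmoid derivative. The sketch below is an assumption based on that reading, not the paper's verified definitions; the function names, the default sharpness k, and the `width` truncation parameter (restricting the sum to nearest integers, as the abstract suggests) are all illustrative choices.

```python
import math

def _sigmoid(z: float) -> float:
    # Numerically stable logistic sigmoid.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

def _sigmoid_deriv(z: float) -> float:
    # sigma'(z) = sigma(z) * (1 - sigma(z)); the exp(-|z|) form avoids overflow.
    e = math.exp(-abs(z))
    return e / (1.0 + e) ** 2

def smooth_round_window(x: float, k: float = 50.0, width: int = 2) -> float:
    # Method 1 (assumed form): sum_n n * [sigma(k(x-n+1/2)) - sigma(k(x-n-1/2))].
    # Each bracketed window tends to the indicator of (n-1/2, n+1/2) as k -> inf.
    m = math.floor(x)
    total = 0.0
    for n in range(m - width, m + width + 1):  # only the nearest integers
        total += n * (_sigmoid(k * (x - n + 0.5)) - _sigmoid(k * (x - n - 0.5)))
    return total

def smooth_round_density(x: float, k: float = 50.0, width: int = 2) -> float:
    # Method 2 (assumed form): normalized weighted average of nearby integers,
    # with weights proportional to the local density sigma'(k(x - n)).
    m = math.floor(x)
    num = den = 0.0
    for n in range(m - width, m + width + 1):
        w = _sigmoid_deriv(k * (x - n))
        num += n * w
        den += w
    return num / den
```

For moderate k both variants already track hard rounding closely away from half-integers (e.g. inputs near 2.7 map to roughly 3), while remaining differentiable everywhere; increasing k sharpens the transition at the cost of steeper gradients.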
Similar Papers
Sharp Lower Bounds for Linearized ReLU^k Approximation on the Sphere
Numerical Analysis
Shows how fast computers learn with certain math.
Sharp Gaussian approximations for Decentralized Federated Learning
Machine Learning (Stat)
Detects computer attacks by watching how they learn.
Sharp bounds in perturbed smooth optimization
Optimization and Control
Makes computer math problems more predictable.