Refinements of Jensen's Inequality for Twice-Differentiable Convex Functions with Bounded Hessian
By: Sambhab Mishra
Potential Business Impact:
Improves math for better computer understanding.
Jensen's inequality, attributed to Johan Jensen (a Danish mathematician and engineer noted for his contributions to the theory of functions), is a ubiquitous result in convex analysis, providing a fundamental lower bound for the expectation of a convex function. In this paper, we establish rigorous refinements of this inequality for twice-differentiable functions with bounded Hessians. Using Taylor expansions with integral remainders, we bridge the gap between classical variance-based bounds and higher-precision estimates. We also derive explicit error terms governed by Grüss-type inequalities, which allow skewness and kurtosis to be incorporated into the bounds. With these new theoretical tools, we improve upon existing estimates for the Shannon entropy of continuous distributions and the ergodic capacity of Rayleigh fading channels, demonstrating the practical efficacy of our refinements.
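The starting point the abstract refers to can be made concrete with the classical second-order refinement of Jensen's inequality: if f is twice differentiable with m ≤ f'' ≤ M, then (m/2)·Var(X) ≤ E[f(X)] − f(E[X]) ≤ (M/2)·Var(X). The sketch below (an illustration of that classical variance-based bound, not the paper's own code or its sharper Grüss-type estimates) checks the sandwich numerically for f = exp on a uniform sample, where f'' is bounded by m = 1 and M = e on [0, 1]; the function name `jensen_gap_bounds` is a hypothetical helper introduced here.

```python
# Numerical check of the classical variance-based refinement of Jensen's
# inequality: for twice-differentiable f with m <= f'' <= M,
#     (m/2) * Var(X)  <=  E[f(X)] - f(E[X])  <=  (M/2) * Var(X).
# Illustration only (not the paper's method): f = exp, X ~ Uniform(0, 1),
# so f'' = exp is bounded by m = 1 and M = e on [0, 1].
import math
import random

def jensen_gap_bounds(f, samples, m, M):
    """Return (lower bound, Jensen gap, upper bound) estimated from samples."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    gap = sum(f(x) for x in samples) / n - f(mean)   # E[f(X)] - f(E[X])
    return 0.5 * m * var, gap, 0.5 * M * var

random.seed(0)
xs = [random.random() for _ in range(200_000)]        # X ~ Uniform(0, 1)
lo, gap, hi = jensen_gap_bounds(math.exp, xs, m=1.0, M=math.e)
print(lo <= gap <= hi)   # the variance-based sandwich holds on this sample
```

The paper's contribution, per the abstract, is to tighten this sandwich by adding correction terms involving skewness and kurtosis via Grüss-type inequalities.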
Similar Papers
Tight Bounds for Jensen's Gap with Applications to Variational Inference
Machine Learning (CS)
Finds better ways to guess answers in smart programs.
Tight Bounds on Jensen's Gap: Novel Approach with Applications in Generative Modeling
Machine Learning (CS)
Improves how computers learn from data.
Inequalities Revisited
Information Theory
Finds new math rules by looking at old ones.