Mirror Descent Using the Tempesta Generalized Multi-parametric Logarithms

Published: June 8, 2025 | arXiv ID: 2506.13984v1

By: Andrzej Cichocki

Potential Business Impact:

Makes machine learning algorithms more adaptive and efficient.

Business Areas:
A/B Testing; Data and Analytics

In this paper, we develop a wide class of Mirror Descent (MD) algorithms, which play a key role in machine learning. For this purpose, we formulate a constrained optimization problem in which we exploit the Bregman divergence with the Tempesta multi-parametric deformed logarithm as a link function. This link function, also called the mirror function, defines the mapping between the primal and dual spaces and is associated with a very wide (in fact, theoretically infinite) class of generalized trace-form entropies. In order to derive novel MD updates, we estimate a generalized exponential function that closely approximates the inverse of the multi-parametric Tempesta generalized logarithm. The shape and properties of the Tempesta logarithm and its inverse deformed exponential function can be tuned by several hyperparameters. By learning these hyperparameters, we can adapt to the distribution or geometry of the training data and adjust them to achieve desired properties of the MD algorithms. The concept of applying multi-parametric logarithms allows us to generate a new, wide, and flexible family of MD and mirror-less MD updates.
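The update scheme described above (map the weights to the dual space with a deformed logarithm, take a gradient step there, and map back with its deformed-exponential inverse) can be sketched in a minimal form. The sketch below is not the paper's algorithm: instead of the full Tempesta multi-parametric logarithm, it uses the one-parameter Tsallis q-logarithm as an illustrative special case of a deformed-logarithm link function; the function names, the toy objective, and the hyperparameter values are all assumptions for illustration.

```python
import numpy as np

def q_log(x, q):
    # Tsallis q-logarithm, used here as a simple one-parameter stand-in
    # for a deformed-logarithm link function (q -> 1 recovers ln x)
    if q == 1.0:
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    # q-exponential, the inverse of q_log on its domain
    if q == 1.0:
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

def mirror_descent(grad, w0, q=0.7, lr=0.1, steps=200):
    # Generic MD loop: dual-space gradient step through the deformed
    # log, then map back to the primal space with the deformed exp.
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        theta = q_log(w, q) - lr * grad(w)   # update in the dual space
        w = q_exp(theta, q)                  # back to the primal space
    return w

# Toy problem (hypothetical): minimize ||w - t||^2 over positive w
t = np.array([0.5, 2.0, 1.0])
grad = lambda w: 2.0 * (w - t)
w_star = mirror_descent(grad, w0=np.ones(3))
```

Varying `q` changes the geometry induced by the link function, which is the lever the paper generalizes: with the multi-parametric Tempesta logarithm, several such hyperparameters could in principle be tuned or learned to match the data distribution.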

Page Count
8 pages

Category
Statistics: Machine Learning (stat.ML)