Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence
The non-parametric version of Amari’s dually affine Information Geometry provides a practical calculus to perform computations of interest in statistical machine learning. The method uses the notion of a statistical bundle, a mathematical structure that includes both probability densities and random variables to capture the spirit of Fisherian statistics. We focus on computations involving a constrained minimization of the Kullback–Leibler divergence. We show how to obtain neat and principled versions of known computations in applications such as mean-field approximation, adversarial generative models, and variational Bayes.
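For context, the constrained problem the abstract refers to can be written in a generic form. A minimal sketch, assuming densities $q$ and $p$ with respect to a common dominating measure $\mu$ and a constraint class $\mathcal{Q}$ (for example, product densities in the mean-field case):

$$
\mathrm{KL}(q \,\|\, p) = \int q \log\frac{q}{p}\,\mathrm{d}\mu , \qquad
\hat q = \operatorname*{arg\,min}_{q \in \mathcal{Q}} \mathrm{KL}(q \,\|\, p) .
$$

The article's statistical-bundle formulation and the specific constraint sets it treats may differ from this generic statement.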
| Main Author: | Giovanni Pistone |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-03-01 |
| Series: | Stats |
| Subjects: | information geometry; Kullback–Leibler divergence; statistical bundle; natural gradient |
| Online Access: | https://www.mdpi.com/2571-905X/8/2/25 |
| _version_ | 1850164830267244544 |
|---|---|
| author | Giovanni Pistone |
| author_facet | Giovanni Pistone |
| author_sort | Giovanni Pistone |
| collection | DOAJ |
| description | The non-parametric version of Amari’s dually affine Information Geometry provides a practical calculus to perform computations of interest in statistical machine learning. The method uses the notion of a statistical bundle, a mathematical structure that includes both probability densities and random variables to capture the spirit of Fisherian statistics. We focus on computations involving a constrained minimization of the Kullback–Leibler divergence. We show how to obtain neat and principled versions of known computations in applications such as mean-field approximation, adversarial generative models, and variational Bayes. |
| format | Article |
| id | doaj-art-e8163ac4cdf44529a792e70e5f6c4e4f |
| institution | OA Journals |
| issn | 2571-905X |
| language | English |
| publishDate | 2025-03-01 |
| publisher | MDPI AG |
| record_format | Article |
| series | Stats |
| spelling | doaj-art-e8163ac4cdf44529a792e70e5f6c4e4f; 2025-08-20T02:21:53Z; eng; MDPI AG; Stats; 2571-905X; 2025-03-01; 8(2): 25; doi:10.3390/stats8020025; Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence; Giovanni Pistone, De Castro Statistics, Collegio Carlo Alberto, 10122 Torino, Italy; (abstract as in the description field above); https://www.mdpi.com/2571-905X/8/2/25; information geometry; Kullback–Leibler divergence; statistical bundle; natural gradient |
| spellingShingle | Giovanni Pistone; Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence; Stats; information geometry; Kullback–Leibler divergence; statistical bundle; natural gradient |
| title | Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence |
| title_full | Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence |
| title_fullStr | Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence |
| title_full_unstemmed | Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence |
| title_short | Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence |
| title_sort | affine calculus for constrained minima of the kullback leibler divergence |
| topic | information geometry; Kullback–Leibler divergence; statistical bundle; natural gradient |
| url | https://www.mdpi.com/2571-905X/8/2/25 |
| work_keys_str_mv | AT giovannipistone affinecalculusforconstrainedminimaofthekullbackleiblerdivergence |