From KL Divergence to Wasserstein Distance: Enhancing Autoencoders with FID Analysis
Variational Autoencoders (VAEs) are popular Bayesian inference models that excel at approximating complex data distributions in a lower-dimensional latent space. Despite their widespread use, VAEs frequently face challenges in image generation, often resulting in blurry outputs. This outcome is pri...
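The record's truncated abstract contrasts the KL-divergence regularizer of a standard VAE with a Wasserstein-based alternative, but does not show the authors' exact objective. Below is a minimal sketch of the two regularizers, assuming a Gaussian encoder and, purely as an illustration, a sliced-Wasserstein penalty between the latent codes and the N(0, I) prior; the function names and the choice of sliced Wasserstein are assumptions, not the paper's method.

```python
# Hedged sketch: standard VAE KL regularizer vs. an illustrative
# sliced-Wasserstein substitute. Names are hypothetical; the paper's
# exact formulation is not given in the truncated abstract.
import torch

def kl_regularizer(mu, logvar):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ), summed over
    # latent dimensions and averaged over the batch -- the term often
    # blamed for over-smoothed (blurry) reconstructions.
    return (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1)).mean()

def sliced_wasserstein_regularizer(z, num_projections=64):
    # Monte Carlo sliced-Wasserstein distance between the batch of latent
    # codes z (shape: batch x latent_dim) and samples from the N(0, I)
    # prior: project both onto random unit directions, sort each 1-D
    # projection, and compare the empirical quantiles.
    prior = torch.randn_like(z)
    directions = torch.randn(z.size(1), num_projections, device=z.device)
    directions = directions / directions.norm(dim=0, keepdim=True)
    proj_z = (z @ directions).sort(dim=0).values
    proj_p = (prior @ directions).sort(dim=0).values
    return ((proj_z - proj_p) ** 2).mean()
```

Either term would be added to the usual reconstruction loss; the sliced variant avoids the closed-form Gaussian assumption at the cost of Monte Carlo projections.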
| Main Authors: | Laxmi Kanta Poudel, Kshtiz Aryal, Rajendra Bahadur Thapa, Sushil Poudel |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | LibraryPress@UF, 2025-05-01 |
| Series: | Proceedings of the International Florida Artificial Intelligence Research Society Conference |
| Online Access: | https://journals.flvc.org/FLAIRS/article/view/139006 |
Similar Items
- Enzyme sequence optimisation via Gromov-Wasserstein Autoencoders integrating MSA techniques
  by: Xuze Wang, et al.
  Published: (2025-12-01)
- An Empirical Study of Self-Supervised Learning with Wasserstein Distance
  by: Makoto Yamada, et al.
  Published: (2024-10-01)
- Towards Analysis of Covariance Descriptors via Bures–Wasserstein Distance
  by: Huajun Huang, et al.
  Published: (2025-07-01)
- An anatomically enhanced and clinically validated framework for lung abnormality classification using deep features and KL divergence
  by: Suresh Kumar Samarla, et al.
  Published: (2025-06-01)
- Reconstructing discrete measures from projections. Consequences on the empirical Sliced Wasserstein Distance
  by: Tanguy, Eloi, et al.
  Published: (2024-11-01)