Regularized divergences between covariance operators and Gaussian measures on Hilbert spaces

10 Apr 2019 · Quang Minh Ha

This work presents an infinite-dimensional generalization of the correspondence between the Kullback-Leibler and Rényi divergences between Gaussian measures on Euclidean space and the Alpha Log-Determinant divergences between symmetric positive definite matrices. Specifically, we present the regularized Kullback-Leibler and Rényi divergences between covariance operators and Gaussian measures on an infinite-dimensional Hilbert space, defined via the infinite-dimensional Alpha Log-Determinant divergences between positive definite trace class operators. We show that, as the regularization parameter approaches zero, the regularized Kullback-Leibler and Rényi divergences between two equivalent Gaussian measures on a Hilbert space converge to the corresponding true divergences. Explicit formulas for all the divergences involved are presented in the most general Gaussian setting.
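The finite-dimensional version of the correspondence the abstract describes can be illustrated numerically: the Kullback-Leibler divergence between two Gaussians on Euclidean space is given by a log-determinant formula, and perturbing both covariances by a multiple of the identity gives a regularized divergence that converges to the true one as the regularization parameter goes to zero. The sketch below is a minimal finite-dimensional illustration only; the function name `gaussian_kl` and the `gamma * I` regularization scheme are assumptions for this example, not the paper's operator-level definitions on Hilbert space.

```python
import numpy as np

def gaussian_kl(m1, S1, m2, S2):
    """KL(N(m1, S1) || N(m2, S2)) via the classical log-determinant formula:
    0.5 * [tr(S2^{-1} S1) - d + (m2-m1)^T S2^{-1} (m2-m1) + log det S2 - log det S1].
    """
    d = len(m1)
    S2_inv = np.linalg.inv(S2)
    diff = m2 - m1
    return 0.5 * (np.trace(S2_inv @ S1) - d
                  + diff @ S2_inv @ diff
                  + np.linalg.slogdet(S2)[1] - np.linalg.slogdet(S1)[1])

rng = np.random.default_rng(0)
d = 5
# Build two random symmetric positive definite covariance matrices.
A = rng.standard_normal((d, d)); S1 = A @ A.T + np.eye(d)
B = rng.standard_normal((d, d)); S2 = B @ B.T + np.eye(d)
m1, m2 = rng.standard_normal(d), rng.standard_normal(d)

true_kl = gaussian_kl(m1, S1, m2, S2)
# Regularized divergence: perturb both covariances by gamma * I.
# As gamma -> 0, the regularized value converges to the true divergence,
# mirroring (in finite dimensions) the convergence result stated above.
for gamma in (1.0, 1e-2, 1e-4, 1e-8):
    reg_kl = gaussian_kl(m1, S1 + gamma * np.eye(d), m2, S2 + gamma * np.eye(d))
    print(f"gamma={gamma:.0e}  |regularized - true| = {abs(reg_kl - true_kl):.2e}")
```

In finite dimensions the regularization is unnecessary (every SPD matrix is invertible with finite log-determinant); it becomes essential on an infinite-dimensional Hilbert space, where covariance operators are trace class and hence non-invertible, which is the setting the paper treats.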


Categories

Probability · Functional Analysis