Kullback-Leibler Divergence for Bayesian Nonparametric Model Checking

14 Mar 2019  ·  Luai Al-Labadi, Vishakh Patel, Kasra Vakiloroayaei, Clement Wan

Bayesian nonparametric statistics is an area of considerable research interest. While there has recently been extensive work on Bayesian nonparametric procedures for model checking, using the Dirichlet process in its simplest form together with the Kullback-Leibler divergence has remained an open problem. The difficulty stems from the discreteness of the Dirichlet process: the Kullback-Leibler divergence between any discrete distribution and any continuous distribution is infinite. The approach proposed in this paper, which combines the Dirichlet process, the Kullback-Leibler divergence, and the relative belief ratio, is presented as the first concrete solution to this issue. The approach is simple to apply and does not require a closed form of the relative belief ratio. A Monte Carlo study and real-data examples show that the developed approach performs excellently.
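The abstract only names the ingredients; the Python sketch below is one plausible way they might fit together, not the authors' algorithm. It draws from a Dirichlet process by truncated stick-breaking, computes a binned KL divergence to the hypothesized model (binning both sides sidesteps the infinite discrete-vs-continuous divergence), and estimates a relative belief ratio by comparing how often the divergence is small under the posterior versus the prior. All function names, the threshold eps, the truncation level, and the binning scheme are illustrative assumptions.

    import numpy as np
    from scipy import stats

    def dp_stick_breaking(a, base_rvs, n_atoms=500, rng=None):
        """One draw from a Dirichlet process DP(a, F0) via truncated
        stick-breaking; base_rvs(k, rng) samples k points from the base."""
        rng = np.random.default_rng(rng)
        betas = rng.beta(1.0, a, size=n_atoms)
        weights = betas * np.cumprod(np.concatenate(([1.0], 1.0 - betas[:-1])))
        weights /= weights.sum()  # renormalize the truncated sticks
        return base_rvs(n_atoms, rng), weights

    def kl_to_model(atoms, weights, model_cdf, n_bins=30):
        """Binned KL divergence between a (discrete) DP draw and the
        hypothesized continuous model on a common grid."""
        edges = np.quantile(atoms, np.linspace(0.0, 1.0, n_bins + 1))
        p, _ = np.histogram(atoms, bins=edges, weights=weights)
        q = np.diff(model_cdf(edges))
        p, q = p / p.sum(), q / q.sum()
        ok = (p > 0) & (q > 0)  # skip empty / zero-width bins
        return float(np.sum(p[ok] * np.log(p[ok] / q[ok])))

    def rb_ratio(data, a, model_cdf, model_rvs, n_sims=200, eps=0.05, rng=0):
        """Monte Carlo relative belief ratio of the event {KL < eps}:
        posterior probability over prior probability. Values above 1
        are evidence in favor of the hypothesized model."""
        rng = np.random.default_rng(rng)
        n = len(data)

        def post_base(k, r):
            # Posterior base measure of DP(a + n, .): mix of F0 and the data.
            use_model = r.random(k) < a / (a + n)
            return np.where(use_model, model_rvs(k, r), r.choice(data, size=k))

        prior_d = [kl_to_model(*dp_stick_breaking(a, model_rvs, rng=rng),
                               model_cdf) for _ in range(n_sims)]
        post_d = [kl_to_model(*dp_stick_breaking(a + n, post_base, rng=rng),
                              model_cdf) for _ in range(n_sims)]
        prior_p = max(np.mean(np.array(prior_d) < eps), 1e-12)
        return np.mean(np.array(post_d) < eps) / prior_p

    # Example: check a N(0, 1) model against mildly shifted data.
    data = np.random.default_rng(1).normal(0.2, 1.0, size=100)
    print(rb_ratio(data, a=1.0,
                   model_cdf=stats.norm.cdf,
                   model_rvs=lambda k, r: stats.norm.rvs(size=k, random_state=r)))

Thresholding the divergence at eps is a stand-in for the paper's relative belief calibration of the distance at zero; the general shape (prior draws from DP(a, F0), posterior draws from DP(a + n, .) with the base measure mixing F0 and the empirical distribution) follows the standard conjugacy of the Dirichlet process.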


Categories


Statistics Theory