Abstract
We show that, in the independent and identically distributed case, the relative entropy between the posterior density, formed from a smooth likelihood and prior, and its limiting normal form tends to zero. Convergence holds both in probability and in mean. Applications to codelengths in stochastic complexity and to sample-size selection are briefly discussed.
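The convergence the abstract states can be illustrated numerically. The sketch below is not from the paper: it picks a simple conjugate setting (Bernoulli observations with a Beta prior, both my choice) and computes, on a grid, the relative entropy between the exact posterior and the normal density matched to the posterior's mean and variance. As the sample size n grows, the computed divergence shrinks toward zero, consistent with the abstract's claim.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_posterior_vs_normal(n, theta0=0.3, a0=1.0, b0=1.0):
    """Illustrative only: KL(posterior || matched normal) for n Bernoulli(theta0)
    observations under a Beta(a0, b0) prior. The posterior is Beta(a0+k, b0+n-k);
    the comparison density is the normal with the same mean and variance."""
    k = rng.binomial(n, theta0)
    a, b = a0 + k, b0 + n - k
    theta = np.linspace(1e-6, 1 - 1e-6, 200001)
    dtheta = theta[1] - theta[0]
    # Log Beta density up to a constant; normalize numerically on the grid.
    logp = (a - 1) * np.log(theta) + (b - 1) * np.log1p(-theta)
    logp -= np.log(np.sum(np.exp(logp - logp.max())) * dtheta) + logp.max()
    p = np.exp(logp)
    # Normal density matched to the posterior mean and variance.
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    logq = -0.5 * np.log(2 * np.pi * var) - (theta - mean) ** 2 / (2 * var)
    # Relative entropy D(p || q) = integral of p * log(p / q), by Riemann sum.
    return np.sum(p * (logp - logq)) * dtheta

for n in (10, 100, 1000, 10000):
    print(n, kl_posterior_vs_normal(n))
```

Printed divergences decay roughly like 1/n here; the paper's result concerns the general smooth-likelihood case, of which this conjugate example is only a special instance.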
Original language | English (US)
---|---
Pages (from-to) | 165-176
Number of pages | 12
Journal | IEEE Transactions on Information Theory
Volume | 45
Issue number | 1
DOIs |
State | Published - 1999
Externally published | Yes
Keywords
- Asymptotic normality
- Posterior density
- Relative entropy
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences