Abstract
Suppose X1,...,Xn are IID p(· | θ, ψ), where (θ, ψ) ∈ ℝ^d is distributed according to the prior density w(·). For estimators Sn = S(X) and Tn = T(X), assumed to be consistent for some function of θ and asymptotically normal, we examine the conditional Shannon mutual information (CSMI) between Θ and Tn given Ψ and Sn, I(Θ, Tn | Ψ, Sn). Several important special cases of this CSMI arise. We establish asymptotic formulas for various cases and identify the resulting noninformative reference priors. As a consequence, we develop the notion of data-dependent priors and a calibration for how close an estimator is to sufficiency.
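As a toy illustration of the ideas above (not from the paper itself), consider the Gaussian location model Xi | θ ~ N(θ, σ²) with conjugate prior θ ~ N(0, τ²). For an asymptotically normal estimator Tn ~ N(θ, c·σ²/n), the Shannon mutual information has the closed form I(Θ; Tn) = ½ log(1 + nτ²/(cσ²)) nats. The sketch below, with the helper name `mi_gaussian` chosen here for illustration, compares the sample mean (sufficient, c = 1) with the sample median (asymptotic variance factor c = π/2); the information gap is one way to see the sufficiency calibration the abstract mentions.

```python
import math

def mi_gaussian(n, tau2=1.0, sigma2=1.0, var_factor=1.0):
    """Mutual information I(Theta; T_n) in nats for the conjugate Gaussian
    model: theta ~ N(0, tau2) and T_n | theta ~ N(theta, var_factor*sigma2/n).
    Closed form: 0.5 * log(1 + n*tau2 / (var_factor*sigma2))."""
    return 0.5 * math.log(1.0 + n * tau2 / (var_factor * sigma2))

n = 100
I_mean = mi_gaussian(n)                             # sample mean: sufficient, c = 1
I_median = mi_gaussian(n, var_factor=math.pi / 2)   # sample median: c = pi/2 asymptotically
print(I_mean, I_median)
```

Because the mean is sufficient, the data-processing inequality forces I(Θ; median) ≤ I(Θ; mean); the ratio of the two values gives a crude calibration of how far the median falls short of sufficiency.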
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 313-345 |
| Number of pages | 33 |
| Journal | Journal of Statistical Planning and Inference |
| Volume | 123 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jul 1 2004 |
Keywords
- Data dependent priors
- Entropy asymptotics
- Objective priors
- Posterior normality
- Shannon information
- Sufficiency
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Applied Mathematics