Partial information reference priors: Derivation and interpretations

Bertrand Clarke, A. Yuan

Research output: Contribution to journal › Article › peer-review

15 Scopus citations


Suppose X1,...,Xn are IID p(·|θ,ψ), where (θ,ψ)∈ℝ^d is distributed according to the prior density w(·). For estimators Sn=S(X) and Tn=T(X), assumed to be consistent for some function of θ and asymptotically normal, we examine the conditional Shannon mutual information (CSMI) between Θ and Tn given Ψ and Sn, I(Θ, Tn | Ψ, Sn). Several important special cases of this CSMI arise. We establish asymptotic formulas for various cases and identify the resulting noninformative reference priors. As a consequence, we develop the notion of data-dependent priors and a calibration for how close an estimator is to sufficiency.
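For reference, the CSMI named in the abstract is the standard conditional mutual information. A minimal sketch of its definition in LaTeX, with notation assumed from the abstract (densities written generically as $p$; the precise densities used in the paper may differ):

```latex
% Conditional Shannon mutual information between Theta and T_n
% given Psi and S_n (standard definition; notation assumed):
I(\Theta, T_n \mid \Psi, S_n)
  = \mathbb{E}\,\log
    \frac{p(\Theta, T_n \mid \Psi, S_n)}
         {p(\Theta \mid \Psi, S_n)\, p(T_n \mid \Psi, S_n)} .
% One special case: with T_n the full sample and no conditioning
% variables, this reduces to the mutual information I(\Theta, X^n)
% familiar from the reference-prior literature.
```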

Original language: English (US)
Pages (from-to): 313-345
Number of pages: 33
Journal: Journal of Statistical Planning and Inference
Issue number: 2
State: Published - Jul 1 2004


Keywords

  • Data dependent priors
  • Entropy asymptotics
  • Objective priors
  • Posterior normality
  • Shannon information
  • Sufficiency

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Applied Mathematics


