The univariate Jeffreys prior is defined as any [[prior]] proportional to the square root of the [[Fisher information]] of the likelihood of the data:
$\pi(\theta) \propto \sqrt{I(\theta)}$
Jeffreys prior is invariant under reparameterization and was proposed as a response to critiques of the [[principle of indifference]]. It is a type of [[objective prior]] and can be an [[improper prior]].
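A quick sketch of why the invariance holds (the standard change-of-variables argument, included here for completeness): under a smooth one-to-one reparameterization $\phi = h(\theta)$, the chain rule gives
$I(\phi) = I(\theta) \big( \frac{d\theta}{d\phi} \big)^2$
so the Jacobian picked up when transforming the prior density is exactly absorbed by the square root:
$\pi(\phi) = \pi(\theta) \big| \frac{d\theta}{d\phi} \big| \propto \sqrt{I(\theta)} \, \big| \frac{d\theta}{d\phi} \big| = \sqrt{I(\phi)}$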
To find Jeffreys prior:
- Find a [[likelihood]] for $\theta \ | \ x$ (often for a single data point)
- Find the log-likelihood
- Then find the [[Fisher information]] by either
- find the first derivative with respect to $\theta$ (the score) and take the [[expectation]] of its square with respect to the data
- find the second derivative with respect to $\theta$ and negate its expectation (the second derivative is often easier)
- Finally find a prior proportional to the square root of the Fisher information; for $n$ iid observations the information is $n$ times the single-observation information, but that constant factor drops out under proportionality (see the SymPy sketch after this list)
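As a concrete illustration of the recipe, here is a minimal SymPy sketch using the exponential likelihood $f(y \mid \theta) = \theta e^{-\theta y}$ as the worked example (the distribution is just for illustration; see the linked notes below for the other derivations):

```python
import sympy as sp

# Exponential likelihood for a single observation: f(y | theta) = theta * exp(-theta * y)
theta = sp.symbols('theta', positive=True)
y = sp.symbols('y', positive=True)
f = theta * sp.exp(-theta * y)

log_lik = sp.log(f)              # log-likelihood
d2 = sp.diff(log_lik, theta, 2)  # second derivative w.r.t. theta

# Here d2 = -1/theta**2 contains no y, so its expectation over the data
# is itself; in general you would still need to take E[.] at this step.
fisher = sp.simplify(-d2)        # I(theta) = 1/theta**2
jeffreys = sp.sqrt(fisher)       # pi(theta) ∝ 1/theta (an improper prior)

print(fisher)    # theta**(-2)
print(jeffreys)  # 1/theta
```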
For multiple [[independent and identically distributed|iid]] data points $Y = Y_1, \dots, Y_n$, the Fisher information is $n$ times the information from a single observation:
$I(\theta) = -nE \Big[ \frac{\partial^2}{\partial \theta^2} \ln f(Y_i \mid \theta) \Big]$
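The factor of $n$ comes from the iid log-likelihood being a sum, since differentiation and expectation are both linear:
$-E \Big[ \frac{\partial^2}{\partial \theta^2} \sum_{i=1}^n \ln f(Y_i \mid \theta) \Big] = \sum_{i=1}^n -E \Big[ \frac{\partial^2}{\partial \theta^2} \ln f(Y_i \mid \theta) \Big] = n \, I_1(\theta)$
where $I_1(\theta)$ denotes the Fisher information of a single observation.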
Jeffreys' priors for the included [[conjugate family]] distributions are
- [[Jeffreys' prior for the Poisson distribution]]
- [[Jeffreys' prior for the exponential distribution]]
- [[Jeffreys' prior for the binomial distribution]]
- [[Jeffreys' prior for the mean and variance of normally distributed data]]