The Jeffreys prior for data from a Poisson distribution is
$\pi_J(\lambda) \propto \sqrt{\frac{n}{\lambda}}$
This Jeffreys prior can be thought of as an improper $\Gamma(0.5, 0)$ distribution; it is improper because a Gamma distribution requires a strictly positive rate parameter (the second parameter). Used as a prior in the [[gamma-Poisson conjugate]] family, it yields a proper Gamma posterior, $\lambda \mid y \sim \Gamma(n \bar x + 0.5,\; n)$.
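A sketch of this conjugate update in Python, assuming SciPy; the data, true rate, and seed are illustrative, not from the note. Note that SciPy parameterizes the Gamma by scale, the reciprocal of the rate used above.

```python
import numpy as np
from scipy import stats

# Illustrative data: n Poisson draws with an assumed true rate of 4.
rng = np.random.default_rng(0)
y = rng.poisson(lam=4.0, size=50)
n, ybar = len(y), y.mean()

# Jeffreys prior ~ improper Gamma(0.5, 0); the posterior is the proper
# Gamma(n*ybar + 0.5, rate=n). SciPy uses scale = 1/rate.
posterior = stats.gamma(a=n * ybar + 0.5, scale=1.0 / n)
print(posterior.mean())  # posterior mean = (n*ybar + 0.5) / n
```

With any appreciable amount of data the extra 0.5 in the shape parameter barely moves the posterior mean away from $\bar x$.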
## derivation
The likelihood for a single observation $y$ from a Poisson distribution is
$
f(y|\lambda) = \frac{e^{-\lambda}\lambda^y}{y!} \propto e^{-\lambda}\lambda^y
$
where the $y!$ term is dropped because it does not depend on $\lambda$.
The log-likelihood is
$\ln f(y|\lambda) = -\lambda + y \ln \lambda$
The first derivative is
$\frac{\partial}{\partial \lambda} \ln f(y|\lambda) = -1 + \frac{y}{\lambda}$
The second derivative is
$\frac{\partial^2}{\partial \lambda^2} \ln f(y|\lambda) = -\frac{y}{\lambda^2}$
The expectation of the Poisson distribution is $E[y] = \lambda$. The Fisher information is the negative expectation of the second derivative of the log-likelihood, multiplied by $n$ because the likelihood above uses a single data point.
$I(\lambda) = -nE \Big[ \frac{\partial^2}{\partial \lambda^2} \ln f(y|\lambda) \Big] = -nE \Big[- \frac{y}{\lambda^2} \Big] = n\frac{E[y]}{\lambda^2} = n\frac{\lambda}{\lambda^2} = \frac{n}{\lambda}$
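The derivative steps above can be checked symbolically; this is a minimal sketch assuming SymPy is available:

```python
import sympy as sp

y, lam, n = sp.symbols('y lambda n', positive=True)

# Log-likelihood of a single observation, with the y!-term dropped.
loglik = -lam + y * sp.log(lam)
second = sp.diff(loglik, lam, 2)  # second derivative: -y/lambda**2

# Fisher information: -n * E[second derivative]. Since E[y] = lambda for a
# Poisson, taking the expectation amounts to substituting y -> lambda.
fisher = sp.simplify(-n * second.subs(y, lam))
print(fisher)  # n/lambda
```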
Note that inside the expectation the parameter $\lambda$ is treated as a constant; the expectation is taken with respect to the data $y$.
Thus, the Jeffreys prior for a Poisson distribution is
$\pi(\lambda) \propto \sqrt{I(\lambda)} = \sqrt{\frac{n}{\lambda}} \propto \frac{1}{\sqrt{\lambda}}$
where the constant $n$ can be dropped since it does not depend on $\lambda$.
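As a numeric cross-check under assumed illustrative data: the normalized product of the Jeffreys prior $\lambda^{-1/2}$ and the Poisson likelihood should match the closed-form $\Gamma(n\bar x + 0.5,\, n)$ posterior density on a grid.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Assumed illustrative data (true rate 3 here, for demonstration only).
rng = np.random.default_rng(1)
y = rng.poisson(lam=3.0, size=30)
n, s = len(y), y.sum()

grid = np.linspace(0.5, 8.0, 4000)
# Unnormalized posterior: lambda^{-1/2} prior times e^{-n*lambda} * lambda^{sum y}.
unnorm = grid ** (s - 0.5) * np.exp(-n * grid)
unnorm /= trapezoid(unnorm, grid)

# Closed-form Gamma posterior density (shape = sum y + 0.5, rate = n).
closed = stats.gamma.pdf(grid, a=s + 0.5, scale=1.0 / n)
print(np.max(np.abs(unnorm - closed)))  # near zero
```

The grid covers many standard deviations around the posterior mode, so the truncation error from normalizing on a finite interval is negligible here.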