The chi-squared distribution is a form of the [[gamma distribution]] parameterized as
$X \sim \Gamma(\frac{n}{2}, \frac12)$
It is typically written using chi-squared notation rather than expressed as a gamma distribution.
The chi-squared distribution arises in statistics as the distribution of a sum of squared [[standard normal distribution|standard normal]] random variables. For $Z_1, \dots, Z_n \overset{iid}\sim N(0,1)$,
$\sum_{i=1}^n Z_i^2 \sim \chi^2(n)$
This is especially useful when standardizing normal random variables: for a random sample $X_1, \dots, X_n$ from $N(\mu, \sigma^2)$,
$\frac{\sum_{i=1}^n (X_i - \bar X)^2}{\sigma^2} \sim \chi^2(n-1)$
As a special case, the chi-squared distribution with $n=1$ is the distribution of the square of a single standard normal random variable.
$[N(0,1)]^2 \sim \chi^2(1)$
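A quick simulation sketch of this fact, using only the standard library (the degrees of freedom, seed, sample size, and tolerances are arbitrary choices for illustration):

```python
import random

random.seed(0)  # for reproducibility

n = 5            # degrees of freedom
trials = 100_000

# Each trial: the sum of n squared standard normal draws,
# which should behave like a chi^2(n) random variable
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(n))
           for _ in range(trials)]

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials

# For chi^2(n): E(X) = n and V(X) = 2n
print(mean, var)  # roughly 5 and 10
```

Setting `n = 1` gives the special case above: the squared draws are then samples from $\chi^2(1)$.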
## relationship to the exponential distribution
The sum of exponential random variables can be converted to a chi-squared random variable by multiplying by $2\lambda$.
Suppose that $X_1, X_2, \dots, X_n$ is a random sample from the [[exponential distribution]] with rate $\lambda>0$. We know that the sum of exponential random variables has the [[gamma distribution]] $\Gamma(n, \lambda)$ and that for a given $X \sim \Gamma(\alpha, \beta)$ and a constant $c > 0$, $Y = cX$ can be modeled as $Y \sim \Gamma(\alpha, \frac{\beta}{c})$. Therefore,
$2 \lambda \sum_{i=1}^n X_i \sim \Gamma \Big (n, \frac12 \Big ) = \Gamma \Big (\frac{2n}{2}, \frac12 \Big ) = \chi^2(2n)$
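A simulation sketch of this conversion (the rate $\lambda$, sample size, and seed are arbitrary choices):

```python
import random

random.seed(1)

lam = 1.5      # exponential rate parameter
n = 4          # sample size per trial
trials = 50_000

# Each trial: 2*lambda times the sum of n exponential(lambda) draws,
# which should behave like a chi^2(2n) random variable
samples = [2 * lam * sum(random.expovariate(lam) for _ in range(n))
           for _ in range(trials)]

mean = sum(samples) / trials

# chi^2(2n) has mean 2n = 8 here
print(mean)
```

Since $2n\lambda \bar X = 2\lambda \sum X_i$, the same simulation also verifies the sample-mean version below.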
The sample mean from an exponential distribution also has a gamma distribution, $\bar X = \frac1n \sum X_i \sim \Gamma(n, n\lambda)$, and can therefore be transformed to a chi-squared random variable by multiplying by $2n\lambda$.
$2n\lambda \bar X \sim \Gamma \Big (n, \frac12 \Big ) = \Gamma \Big (\frac{2n}{2}, \frac12 \Big ) = \chi^2(2n)$
## Notation
$X \sim \chi^2(n)$
where $n$ is the [[degrees of freedom]].
## Probability Density Function
$f(x) = \frac{1}{\Gamma(n/2)}(\frac12)^{n/2}x^{(n/2)-1}e^{-x/2} \ I_{(0,\infty)}(x)$
## Expected Value
$E(X) = n$
## Variance
$V(X) = 2n$
## Moment generating function
$M_X(t) = \Big (\frac{1/2}{1/2 - t} \Big )^{n/2}, \quad t < \frac12$
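The density, mean, variance, and MGF above can be checked against each other numerically. A Riemann-sum sketch using only the standard library (the grid bounds, step size, and the value of $t$ are arbitrary choices):

```python
import math

def chi2_pdf(x, n):
    """Density of chi^2(n), i.e. Gamma(n/2, rate 1/2), for x > 0."""
    return (0.5 ** (n / 2)) / math.gamma(n / 2) * x ** (n / 2 - 1) * math.exp(-x / 2)

n = 4
dx = 0.001
grid = [i * dx for i in range(1, 200_001)]  # (0, 200]; the tail beyond is negligible

total = sum(chi2_pdf(x, n) for x in grid) * dx            # should be ~1
mean = sum(x * chi2_pdf(x, n) for x in grid) * dx          # should be ~n
second = sum(x * x * chi2_pdf(x, n) for x in grid) * dx
var = second - mean ** 2                                   # should be ~2n

# Numeric MGF at t = 0.1 vs. the closed form (1/2 / (1/2 - t))^(n/2)
t = 0.1
mgf_num = sum(math.exp(t * x) * chi2_pdf(x, n) for x in grid) * dx
mgf_formula = (0.5 / (0.5 - t)) ** (n / 2)

print(total, mean, var, mgf_num, mgf_formula)
```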
## Alternative notation
Some authors use an alternative parameterization of the gamma distribution in which the second parameter is the scale (the inverse of the rate), as below.
$X \sim \Gamma(\frac{n}{2}, 2)$
## Transformations
Suppose that $X_1, X_2, \dots, X_k$ are independent random variables with $X_i \sim \chi^2(n_i)$. These random variables are independent but not necessarily identically distributed, as each has its own degrees of freedom $n_i$. Their sum is also chi-squared, with degrees of freedom equal to the sum of the individual degrees of freedom.
$X = \sum_{i=1}^k X_i \sim \chi^2(n_1 + n_2 + \dots + n_k)$
If the random variables are [[independent and identically distributed|iid]], the sum also has a chi-squared distribution with $kn$ degrees of freedom.
$X \sim \chi^2(kn)$
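The additivity follows from multiplying the MGFs, since the MGF of a sum of independent random variables is the product of their MGFs. A small numeric check (the values of $t$ and the degrees of freedom are arbitrary choices):

```python
import math

def chi2_mgf(t, n):
    """MGF of chi^2(n), valid for t < 1/2."""
    return (0.5 / (0.5 - t)) ** (n / 2)

# Product of the MGFs of chi^2(3) and chi^2(5) equals the MGF of chi^2(8)
for t in [0.0, 0.1, 0.25, 0.4]:
    assert math.isclose(chi2_mgf(t, 3) * chi2_mgf(t, 5), chi2_mgf(t, 8))
```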
The difference of two chi-squared random variables is *not* in general chi-squared. However, given independent random variables $X_2$ and $X_3$ where $X_2 \sim \chi^2(n_2)$ and $X_1 = X_2 + X_3 \sim \chi^2(n_1)$, then
$X_3 = X_1 - X_2 \sim \chi^2(n_1 - n_2)$
This is a somewhat contrived example in which the chi-squared random variables are built up so that their difference can take no negative values. This fact can be used to show that the (suitably scaled) [[sample variance]] has a chi-squared distribution.
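A simulation sketch of that sample-variance fact from the first section, $\sum (X_i - \bar X)^2 / \sigma^2 \sim \chi^2(n-1)$ (the choices of $\mu$, $\sigma$, sample size, and seed are arbitrary):

```python
import random

random.seed(2)

mu, sigma = 10.0, 3.0
n = 6
trials = 50_000

# Each trial: the scaled sum of squared deviations from the sample mean,
# which should behave like a chi^2(n - 1) random variable
samples = []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    samples.append(sum((x - xbar) ** 2 for x in xs) / sigma ** 2)

mean = sum(samples) / trials
print(mean)  # close to n - 1 = 5
```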