The gamma distribution is important primarily because the [[chi-squared distribution]] is a special case of it. The [[gamma function]] $\Gamma(\alpha)$ is a normalizing constant that ensures the pdf integrates to $1$ (note that the gamma function takes only $\alpha$, whereas the gamma distribution is parameterized by both $\alpha$ and $\beta$). In our parameterization of the gamma distribution, $\alpha$ is the "shape" parameter and $\beta$ is the "inverse scale" (or "rate") parameter.
## Notation
$X \sim \Gamma(\alpha, \beta)$
## Probability Density Function
$f(x) = \frac{1}{\Gamma(\alpha)}\beta^\alpha x^{\alpha-1} e^{-\beta x}$
for $x > 0$, where $\alpha > 0$ and $\beta > 0$.
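As a sanity check, the pdf above can be coded directly and numerically integrated; the result should be very close to $1$. This is a minimal sketch using only the standard library (`math.gamma` is the gamma function $\Gamma(\alpha)$); the values of $\alpha$, $\beta$, and the integration grid are arbitrary choices for illustration.

```python
import math

def gamma_pdf(x, alpha, beta):
    """Gamma pdf with shape alpha and inverse-scale (rate) beta, for x > 0."""
    return beta**alpha * x**(alpha - 1) * math.exp(-beta * x) / math.gamma(alpha)

# Midpoint-rule integration over [0, 50]; the tail beyond 50 is negligible here.
alpha, beta = 2.5, 1.5
n, upper = 100_000, 50.0
dx = upper / n
total = sum(gamma_pdf((i + 0.5) * dx, alpha, beta) for i in range(n)) * dx
print(round(total, 4))  # close to 1.0
```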
## Expected Value
$E(X) = \frac{\alpha}{\beta}$
## Variance
$V(X) = \frac{\alpha}{\beta ^ 2}$
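Both moments can be checked by simulation. Note that the standard library's `random.gammavariate(alpha, beta)` treats its second argument as a *scale* parameter, so under our inverse-scale parameterization we must pass $1/\beta$; the specific parameter values below are illustrative.

```python
import random
import statistics

random.seed(0)
alpha, beta = 3.0, 2.0  # shape, inverse scale (rate)

# random.gammavariate takes a scale parameter, so pass 1/beta for rate beta.
samples = [random.gammavariate(alpha, 1 / beta) for _ in range(200_000)]

print(statistics.mean(samples))      # close to alpha / beta    = 1.5
print(statistics.variance(samples))  # close to alpha / beta**2 = 0.75
```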
## Moment generating function
$M_X(t) = \Big (\frac{\beta}{\beta - t} \Big )^\alpha$
for $t < \beta$.
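The mgf can likewise be verified empirically: by definition $M_X(t) = E(e^{tX})$, so a sample average of $e^{tX}$ should approach the closed form above. A sketch with arbitrary illustrative parameters (again passing $1/\beta$ as the scale to `random.gammavariate`):

```python
import math
import random

random.seed(1)
alpha, beta, t = 2.0, 3.0, 1.0  # t < beta is required for the mgf to exist

samples = [random.gammavariate(alpha, 1 / beta) for _ in range(200_000)]
empirical = sum(math.exp(t * x) for x in samples) / len(samples)
closed_form = (beta / (beta - t)) ** alpha

print(empirical, closed_form)  # both near (3/2)^2 = 2.25
```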
## Alternative notation
When $\beta$ is provided as a "scale" parameter rather than an "inverse scale" parameter, the pdf becomes
$f(x) = \frac{1}{\Gamma(\alpha)} \frac{1}{\beta^\alpha} x^{\alpha-1} e^{-\frac{x}{\beta}}$
$\alpha$ is known as the "shape" parameter. For $\alpha > 1$, the y-intercept is $0$. For $\alpha = 1$, the y-intercept equals the leading constant ($\beta$ in the inverse-scale parameterization, $\frac{1}{\beta}$ in the scale parameterization) and the function is monotonically decreasing. For $0 < \alpha < 1$, the function has a vertical asymptote at $x=0$ and decays toward $y=0$ as $x \to \infty$.
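The three shape regimes can be seen by evaluating the pdf just above $x = 0$. A minimal sketch in the inverse-scale parameterization, with $\beta = 1$ chosen for illustration:

```python
import math

def gamma_pdf(x, alpha, beta):
    """Gamma pdf with shape alpha and inverse-scale (rate) beta, for x > 0."""
    return beta**alpha * x**(alpha - 1) * math.exp(-beta * x) / math.gamma(alpha)

beta, x_small = 1.0, 1e-8
print(gamma_pdf(x_small, 2.0, beta))  # alpha > 1:     near 0
print(gamma_pdf(x_small, 1.0, beta))  # alpha = 1:     near beta
print(gamma_pdf(x_small, 0.5, beta))  # 0 < alpha < 1: very large (blows up at x = 0)
```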
## Transformations
Given $X \sim \Gamma(\alpha, \beta)$ and a constant $c > 0$, $Y = cX$ can be modeled as
$Y \sim \Gamma(\alpha, \frac{\beta}{c})$
provided $\beta$ is an "inverse scale" parameter.
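This scaling property can be checked by simulation: scaling samples of $X \sim \Gamma(\alpha, \beta)$ by $c$ should match direct samples from $\Gamma(\alpha, \frac{\beta}{c})$. A sketch with arbitrary illustrative parameters ($1/\beta$ is passed as the scale because `random.gammavariate` takes a scale, not a rate):

```python
import random
import statistics

random.seed(2)
alpha, beta, c = 2.0, 4.0, 3.0  # X ~ Gamma(shape=2, rate=4); Y = 3X

x = [random.gammavariate(alpha, 1 / beta) for _ in range(200_000)]
y = [c * xi for xi in x]

# Y should behave like Gamma(alpha, beta / c): mean = alpha / (beta / c) = 1.5
y_direct = [random.gammavariate(alpha, 1 / (beta / c)) for _ in range(200_000)]
print(statistics.mean(y), statistics.mean(y_direct))  # both near 1.5
```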