The sum of several [[independent and identically distributed]] [[random variable|random variables]] often follows a known distribution, sometimes of the same family and sometimes of a different one. A [[moment generating function]] argument is the usual way to derive which distribution the sum has; the common cases are listed below.
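As a quick illustration of the MGF argument, here is a minimal symbolic sketch (the use of `sympy` and the Bernoulli example are my own choices, not part of the note): the MGF of a sum of independent variables is the product of the individual MGFs, so $n$ *iid* Bernoulli($p$) variables give exactly the Binomial($n,p$) MGF.

```python
# Sketch: the MGF of a sum of independent variables is the product of the
# individual MGFs, so n iid Bernoulli(p) variables reproduce the Binomial MGF.
import sympy as sp

t, p, n = sp.symbols("t p n", positive=True)

# MGF of one Bernoulli(p) variable: E[e^{tX}] = 1 - p + p*e^t
mgf_bernoulli = 1 - p + p * sp.exp(t)

# MGF of the sum of n independent copies = product of n identical MGFs
mgf_sum = mgf_bernoulli ** n

# MGF of Binomial(n, p): (1 - p + p*e^t)^n, i.e. the same expression
mgf_binomial = (1 - p + p * sp.exp(t)) ** n

print(sp.simplify(mgf_sum - mgf_binomial))  # 0
```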
## Bernoulli
The sum of $n$ *iid* [[Bernoulli distribution]] random variables $X_i\sim \text{Bernoulli}(p)$ has the [[Binomial distribution]]
$\sum_{i=1}^{n} X_i\sim \text{Bin}(n,p)$
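A minimal simulation sketch of this (the parameters $n=10$, $p=0.3$ and the seed are arbitrary choices of mine):

```python
# Sum n Bernoulli(p) draws many times and compare the empirical frequencies
# against the Binomial(n, p) pmf; the gap should be sampling noise only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p, reps = 10, 0.3, 100_000

# Each row holds n Bernoulli(p) draws; summing a row gives one sample of the sum
sums = rng.binomial(1, p, size=(reps, n)).sum(axis=1)

ks = np.arange(n + 1)
empirical = np.bincount(sums, minlength=n + 1) / reps
print(np.max(np.abs(empirical - stats.binom.pmf(ks, n, p))))  # small (sampling noise)
```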
## Binomial
The sum of $m$ *iid* [[Binomial distribution]] random variables $X_i\sim \text{Bin}(n,p)$ (with a common $p$) has the binomial distribution
$\sum_{i=1}^{m} X_i\sim \text{Bin}(mn,p)$
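This one can be checked exactly by convolving pmfs, since the pmf of a sum of independent variables is the convolution of their pmfs (a sketch with arbitrary $m=3$, $n=4$, $p=0.25$):

```python
# Convolve m copies of the Binomial(n, p) pmf and compare with Binomial(mn, p).
import numpy as np
from scipy import stats

m, n, p = 3, 4, 0.25

pmf_single = stats.binom.pmf(np.arange(n + 1), n, p)

pmf_sum = pmf_single
for _ in range(m - 1):
    pmf_sum = np.convolve(pmf_sum, pmf_single)  # pmf of the running sum

print(np.max(np.abs(pmf_sum - stats.binom.pmf(np.arange(m * n + 1), m * n, p))))  # ~0
```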
## Exponential
The sum of $n$ *iid* [[exponential distribution]] random variables $X_i\sim \text{Exp}(\text{rate}=\lambda)$ has the [[gamma distribution]] with shape $n$ and rate $\lambda$
$\sum_{i=1}^{n} X_i\sim \Gamma(n,\lambda)$
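A simulation sketch (the rate $\lambda=2$, $n=5$ and the seed are arbitrary; note that both `numpy` and `scipy` parameterise by the scale, i.e. $1/\lambda$):

```python
# Sum n Exp(rate=lam) draws and KS-test the sums against Gamma(shape=n, rate=lam).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lam, n, reps = 2.0, 5, 50_000

# numpy's exponential is parameterised by the scale = 1/rate
sums = rng.exponential(scale=1 / lam, size=(reps, n)).sum(axis=1)

# scipy's gamma takes shape a and scale, so scale = 1/rate
print(stats.kstest(sums, stats.gamma(a=n, scale=1 / lam).cdf))  # large p-value expected
```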
## Gamma
The sum of $n$ *iid* [[gamma distribution]] random variables $X_i\sim \Gamma(\alpha, \beta)$ has the gamma distribution with the shape parameters added and $\beta$ unchanged
$\sum_{i=1}^{n} X_i\sim \Gamma(n\alpha, \beta)$
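A moment-matching sketch, treating $\beta$ as a rate (the values $\alpha=2.5$, $\beta=1.5$, $n=4$ are arbitrary):

```python
# The sum of n Gamma(alpha, rate=beta) draws should have the Gamma(n*alpha, beta)
# mean n*alpha/beta and variance n*alpha/beta**2.
import numpy as np

rng = np.random.default_rng(2)
alpha, beta, n, reps = 2.5, 1.5, 4, 200_000

# numpy's gamma takes (shape, scale); with beta as a rate, scale = 1/beta
sums = rng.gamma(alpha, 1 / beta, size=(reps, n)).sum(axis=1)

print(sums.mean(), n * alpha / beta)      # both ~6.67
print(sums.var(), n * alpha / beta ** 2)  # both ~4.44
```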
## Poisson
The sum of $n$ *iid* [[Poisson distribution]] random variables $X_i\sim \text{Pois}(\lambda)$ has the Poisson distribution
$\sum_{i=1}^{n} X_i\sim \text{Pois}(n\lambda)$
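An exact check by truncated convolution (the choice of $\lambda=3$, two summands, and truncation at 60 is arbitrary; truncating far out in the tail keeps the retained entries exact up to negligible mass):

```python
# Convolving two Poisson(lam) pmfs should reproduce the Poisson(2*lam) pmf.
import numpy as np
from scipy import stats

lam, K = 3.0, 60  # K: truncation point, far out in the tail

pmf = stats.poisson.pmf(np.arange(K), lam)
pmf_sum = np.convolve(pmf, pmf)[:K]  # exact for indices below K

print(np.max(np.abs(pmf_sum - stats.poisson.pmf(np.arange(K), 2 * lam))))  # ~0
```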
## Normal (Gaussian)
The sum of $n$ *iid* [[normal distribution]] random variables $X_i\sim N(\mu, \sigma^2)$ has the normal distribution
$\sum_{i=1}^{n} X_i\sim N(n\mu, n\sigma^2)$
In the case of jointly normal random variables, independence is not needed for the sum to be normal, although the variance then includes covariance terms rather than being $n\sigma^2$. See [[linear combination of normal random variables]] for more detail.
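A simulation sketch of the *iid* case (the values $\mu=1$, $\sigma=2$, $n=3$ are arbitrary; `scipy`'s normal takes the standard deviation, so the scale is $\sqrt{n}\,\sigma$):

```python
# Sum n iid N(mu, sigma^2) draws and KS-test the sums against N(n*mu, n*sigma^2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mu, sigma, n, reps = 1.0, 2.0, 3, 50_000

sums = rng.normal(mu, sigma, size=(reps, n)).sum(axis=1)

print(stats.kstest(sums, stats.norm(loc=n * mu, scale=np.sqrt(n) * sigma).cdf))
```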
## Chi-squared
The sum of $k$ *iid* [[chi-squared distribution]] random variables $X_i\sim \chi^2(n)$, each with $n$ degrees of freedom, also has a chi-squared distribution, with $kn$ degrees of freedom.
$\sum_{i=1}^{k} X_i \sim \chi^2(kn)$
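A simulation sketch (the values $k=4$ variables with $n=3$ degrees of freedom each, plus the seed, are arbitrary):

```python
# Sum k iid chi-squared(n) draws and KS-test the sums against chi-squared(k*n).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
k, n, reps = 4, 3, 50_000

sums = rng.chisquare(n, size=(reps, k)).sum(axis=1)

print(stats.kstest(sums, stats.chi2(df=k * n).cdf))  # large p-value expected
```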
[[base/Statistics/convolution]]
> [!Tip]- Additional Resources
> - [Convolutions | Why X+Y in probability is a beautiful mess](https://youtu.be/IaSGqQa5O-M?si=dgNqXIsREx-hinQR) | 3Blue1Brown