The sum of $n$ [[independent and identically distributed|iid]] random variables $X_1, \dots, X_n$ from the [[normal distribution]] $N(\mu, \sigma^2)$ has the normal distribution
$\sum_{i=1}^n X_i \sim N(n\mu, n\sigma^2)$
In fact, any linear combination of [[independent]] normal random variables is also normally distributed. For a plain sum, the mean is the sum of the individual means and the variance is the sum of the individual variances. This holds even when those means and variances differ (i.e., the variables are not [[identically distributed]]). Independence, however, cannot be dropped: without it the variance picks up covariance terms (a linear combination of *jointly* normal variables is still normal, but its variance is no longer just the sum).
If $X_1, X_2, \dots, X_n$ are independent with $X_i \sim N(\mu_i, \sigma^2_i)$, then
$\sum_{i=1}^n X_i \sim N(\sum_{i=1}^n \mu_i, \sum_{i=1}^n\sigma^2_i)$
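As a quick numerical check, here is a minimal NumPy sketch (the parameter values are arbitrary choices for illustration) that draws three independent normals with different means and variances and verifies that their sum has the predicted mean and variance:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Arbitrary illustrative parameters for three independent normals
mus = np.array([1.0, -2.0, 0.5])
sigmas = np.array([1.0, 0.5, 2.0])

n_draws = 100_000
# Column i holds draws of X_i ~ N(mu_i, sigma_i^2)
samples = rng.normal(loc=mus, scale=sigmas, size=(n_draws, 3))
sums = samples.sum(axis=1)

print(sums.mean())  # ~ sum of the mu_i = -0.5
print(sums.var())   # ~ sum of the sigma_i^2 = 1 + 0.25 + 4 = 5.25
```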
Returning to the iid case, where each $X_i \sim N(\mu, \sigma^2)$: for any constants $a_1, a_2, \dots, a_n$ (not all zero) and any constant $b$,
$a_1X_1 + a_2X_2 + \dots + a_nX_n + b$
has a normal distribution
$\sum_{i=1}^n a_iX_i + b \sim N \Big ( \mu \sum_{i=1}^n a_i + b, \ \sigma^2 \sum_{i=1}^n a_i^2 \Big )$
Since [[expectation]] is a linear operator,
$E \Big [ \sum_{i=1}^n a_iX_i + b \Big ] = E \Big [ \sum_{i=1}^n a_iX_i \Big ] + b = \sum_{i=1}^n E[a_iX_i] + b = \sum_{i=1}^n a_iE[X_i] + b = \sum_{i=1}^n a_i \mu + b = \mu \sum_{i=1}^n a_i + b$
Under independence, [[variance]] can be calculated as
$V \Big [\sum_{i=1}^n a_iX_i + b \Big] = V \Big [\sum_{i=1}^n a_iX_i \Big] \overset{indep}{=} \sum_{i=1}^n V [a_iX_i] = \sum_{i=1}^n a_i^2 V [X_i] = \sum_{i=1}^n a_i^2 \sigma^2 = \sigma^2 \sum_{i=1}^n a_i^2$
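To see both identities at work numerically, here is a small sketch (coefficients and parameters are made up for illustration) comparing the empirical mean and variance of $\sum a_iX_i + b$ against the formulas above:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

mu, sigma = 2.0, 3.0            # common iid parameters
a = np.array([0.5, -1.0, 2.0])  # arbitrary coefficients
b = 4.0

n_draws = 100_000
X = rng.normal(mu, sigma, size=(n_draws, len(a)))
Y = X @ a + b  # the linear combination a_1*X_1 + ... + a_n*X_n + b

print(Y.mean())  # ~ mu * sum(a_i) + b    = 2 * 1.5 + 4 = 7.0
print(Y.var())   # ~ sigma^2 * sum(a_i^2) = 9 * 5.25    = 47.25
```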
A linear combination of particular importance to us is the sample mean, which takes $a_i = 1/n$ and $b = 0$; note that squaring the $1/n$ factor is what produces the $\sigma^2/n$ variance.
$\bar X = \frac1n \sum_{i=1}^n X_i \sim N \Big (\mu, \frac{\sigma^2}{n} \Big )$
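A simulation of the sample mean shows the $1/n$ shrinkage of the variance directly; this sketch (sample size and parameters chosen arbitrarily) draws many samples of size $n$ and looks at the spread of their means:

```python
import numpy as np

rng = np.random.default_rng(seed=2)

mu, sigma, n = 5.0, 2.0, 25
n_reps = 100_000

# Each row is one sample of size n; average within each row
xbar = rng.normal(mu, sigma, size=(n_reps, n)).mean(axis=1)

print(xbar.mean())  # ~ mu = 5
print(xbar.var())   # ~ sigma^2 / n = 4 / 25 = 0.16
```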
## a word of caution
If $X_1$ and $X_2$ are [[independent and identically distributed|iid]] $N(0,1)$ random variables, the distribution of $X_1 + X_2$ is not the same as that of $2X_1$.
This may be surprising. Because $X_1$ and $X_2$ have the same distribution, it is tempting to substitute one for the other in the variance calculation and then, using the property $Var(aX) = a^2 Var(X)$, square the constant when factoring it out.
$Var(X_1 + X_2) \ne Var(2X_1) = 2^2 \cdot Var(X_1)$
That substitution would give $2^2 \cdot 1 = 4$ for the variance of $X_1 + X_2$.
However, if we instead consider $X_1$ and $X_2$ to be distinct, although sharing the same mean and variance, we would instead have
$Var(X_1 + X_2) \overset{indep}{=} Var(X_1) + Var(X_2)$
so $Var(X_1) + Var(X_2) = 1 + 1 = 2$, not $4$!
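The distinction is easy to demonstrate by simulation; here is a minimal sketch with two independent standard normal samples:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n_draws = 100_000

x1 = rng.normal(0, 1, n_draws)
x2 = rng.normal(0, 1, n_draws)  # drawn independently of x1

print(np.var(x1 + x2))  # ~ 2: variances add under independence
print(np.var(2 * x1))   # ~ 4: the constant comes out squared
```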
Similarly, a difference between two *iid* random variables $X_1$ and $X_2$ must be treated with equal caution.
$V(X_1 - X_2) \ne V(X_1) - V(X_2)$
Instead,
$\displaylines{
\begin{align}
V(X_1 - X_2) &= V(X_1 + (-1)X_2) \\
&= V(X_1) + V[(-1)X_2] \\
&= V(X_1) + (-1)^2V(X_2) \\
&= V(X_1) + V(X_2)
\end{align}}
$
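The same kind of simulation confirms that the difference of two iid normals has the same variance as their sum:

```python
import numpy as np

rng = np.random.default_rng(seed=4)
n_draws = 100_000

x1 = rng.normal(0, 1, n_draws)
x2 = rng.normal(0, 1, n_draws)  # drawn independently of x1

print(np.var(x1 - x2))  # ~ 2, matching Var(X_1) + Var(X_2)
```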