$X_1, X_2, \dots, X_n$ are [[independent]] and [[identically distributed]] (*iid*) if the $X_i$ are mutually independent and each random variable has the same distribution.
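Spelled out in terms of a common density $f(x; \theta)$ (the same notation used in the model below), independence lets the joint density factor into a product, and identical distribution makes every factor the same:

$$
f_{X_1, \dots, X_n}(x_1, \dots, x_n; \theta) = \prod_{i=1}^{n} f(x_i; \theta)
$$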
A random (equal-probability) sample, if carried out successfully, yields iid observations.
This assumption is powerful because it collapses a potentially high-dimensional problem into a far more tractable one. Consider the simple model $(X, f(x; \theta))$.
The fully specified model where $X \sim N_n(\vec \mu, \Sigma)$ requires a mean vector $\vec \mu$ of length $n$ and an $n \times n$ [[variance-covariance matrix]] $\Sigma$.
By assuming that the $X_i$ are iid, we can collapse the vector of means to a single value $\mu$, set the off-diagonal covariances of $\Sigma$ to 0, and collapse its diagonal to a single value $\sigma^2$.
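As a quick sketch of the savings (counting the free entries of a symmetric $\Sigma$), the unrestricted model has $n + \tfrac{n(n+1)}{2}$ parameters, while the iid model has just two:

$$
X \sim N_n\!\left(\mu \mathbf{1}_n,\; \sigma^2 I_n\right), \qquad \underbrace{n + \tfrac{n(n+1)}{2}}_{\text{unrestricted } (\vec\mu,\, \Sigma)} \;\longrightarrow\; \underbrace{2}_{(\mu,\, \sigma^2)},
$$

where $\mathbf{1}_n$ is the vector of ones and $I_n$ is the $n \times n$ identity matrix.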