The sequence of [[random variable|random variables]] $X_1, X_2, \dots, X_n$ converges in probability to a random variable $X$ if, for any $\epsilon>0$
$\lim_{n \to \infty} P(|X_n - X|>\epsilon) = 0$
We write
$X_n \overset{P}{\to} X$
In other words, the probability that the random variable $X_n$ differs from the random variable $X$ by more than some value $\epsilon$ in absolute terms approaches $0$ as $n$ goes to infinity (i.e., as the sample size becomes very large). Taking the probability of this event removes the randomness of the two random variables: the definition is a statement about an ordinary deterministic sequence of numbers, $P(|X_n - X| > \epsilon)$, converging to zero.
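The definition can be checked empirically. A standard example (a hypothetical choice for illustration, not taken from the text above) is the sample mean of $n$ i.i.d. $\text{Uniform}(0,1)$ draws, which converges in probability to $\mu = 0.5$ by the weak law of large numbers. The sketch below estimates $P(|X_n - \mu| > \epsilon)$ by Monte Carlo for increasing $n$; the estimates should shrink toward $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1    # the fixed epsilon in the definition
mu = 0.5     # mean of Uniform(0, 1), the limit of the sample mean
reps = 5000  # number of simulated copies of X_n per sample size

# Estimate P(|X_n - mu| > eps), where X_n is the sample mean of n draws.
probs = []
for n in [10, 100, 1000, 10000]:
    means = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)
    probs.append(np.mean(np.abs(means - mu) > eps))
    print(f"n = {n:>5}: estimated P(|X_n - mu| > {eps}) = {probs[-1]:.4f}")
```

Note that $\epsilon$ stays fixed while $n$ grows; only the sample size is taken to infinity.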
## Properties
Suppose that $X_n$ and $Y_n$ are sequences of random variables such that $X_n \overset{P}{\to}X$ and $Y_n \overset{P}{\to}Y$ for random variables $X$ and $Y$. Then
$X_n + Y_n \overset{P}{\to} X + Y$
$X_nY_n \overset{P}{\to} XY$
$X_n/Y_n \overset{P}{\to} X/Y \quad \text{provided } P(Y = 0) = 0$
$g(X_n) \overset{P}{\to} g(X) \quad \text{for any continuous function } g$

The last property is known as the continuous mapping theorem.
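These properties can also be illustrated by simulation. In the sketch below (a hypothetical setup for illustration), $X_n$ is the sample mean of Exponential(1) draws, converging in probability to $1$, and $Y_n$ is the sample mean of Uniform(0, 1) draws, converging to $0.5$. The sum, product, quotient, and the continuous transform $g(x) = x^2$ should then converge to the corresponding transforms of the limits:

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 0.05   # fixed epsilon for all four checks
n = 20000    # sample size per mean
reps = 5000  # simulated copies of (X_n, Y_n)

# X_n -> 1 (mean of Exponential(1)); Y_n -> 0.5 (mean of Uniform(0, 1)).
x = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
y = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)

# Estimated exceedance probabilities; each should be near 0.
p_sum  = np.mean(np.abs((x + y) - 1.5) > eps)  # X_n + Y_n -> 1 + 0.5
p_prod = np.mean(np.abs((x * y) - 0.5) > eps)  # X_n * Y_n -> 1 * 0.5
p_quot = np.mean(np.abs((x / y) - 2.0) > eps)  # X_n / Y_n -> 1 / 0.5
p_g    = np.mean(np.abs((x**2) - 1.0) > eps)   # g(X_n) -> g(1), g(x) = x^2
print(p_sum, p_prod, p_quot, p_g)
```

Here the limits are constants, so the quotient property's requirement that the denominator limit is nonzero is satisfied ($Y = 0.5$).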