The probability mass function (pmf) gives the probability that a discrete random variable takes on a particular value, written $P(X = x)$. The analogue for a continuous random variable is the [[probability density function]].
## Properties
From the axioms of probability:
1. $0 \leq P(X = x) \leq 1$
2. $\displaystyle \sum_x P(X = x) = P(S) = 1$
3. $P(X = a \cup X = b) = P(X=a) + P(X=b)$ for $a \neq b$, since the events $\{X=a\}$ and $\{X=b\}$ are mutually exclusive
The sum across all values of the pmf must equal 1. You can use this to confirm that any derived pmf is valid.
$\displaystyle \sum_x P(X = x) = 1$
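As a quick illustration, here is a minimal sketch of this validity check in Python. The dictionary `pmf` and its probabilities are made-up example values, not from any particular distribution:
```python
# Hypothetical pmf, given as {value: probability}; the numbers are made up
pmf = {1: 0.1, 2: 0.2, 3: 0.4, 4: 0.3}

# Property 1: every probability lies in [0, 1]
assert all(0 <= prob <= 1 for prob in pmf.values())

# Property 2: the probabilities sum to 1 (allow a small floating-point tolerance)
assert abs(sum(pmf.values()) - 1) < 1e-9

print("valid pmf")
```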
## Deriving a pmf
A patient is waiting for a kidney transplant. The probability of any given donor being a match is $p$, independently of the other donors. Derive the pmf of the number of donors tested until a match is found.
First, let's define the sample space $S$ as $\{1, 01, 001, 0001, \dots\}$. In other words, one outcome is that the next donor is a match ($1$), another outcome is that the next donor is not a match but the following donor is ($01$), and so on.
Let $X$ be the number of donors tested until a match is found.
We can derive the probability mass function by writing out the probability of each outcome until a general pattern emerges. The probability of a donor being a match is $p$ and the probability of a donor not being a match is $1-p$.
$\displaylines{
P(X=1) = P(\{1\}) = p \\
P(X=2) = P(\{01\}) = (1-p)\, p \\
P(X=3) = P(\{001\}) = (1-p)^2\, p \\
P(X=4) = P(\{0001\}) = (1-p)^3\, p \\
}$
We can now see the pattern: $P(X=x)$ is given by
$P(X=x) = (1-p)^{x-1}\, p, \quad x = 1, 2, 3, \dots$
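Using the validity check from the Properties section, we can confirm this pmf sums to 1; the sum is a geometric series (assuming $0 < p \leq 1$):
$\displaystyle \sum_{x=1}^{\infty} (1-p)^{x-1}\, p = p \sum_{k=0}^{\infty} (1-p)^{k} = p \cdot \frac{1}{1 - (1-p)} = 1$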
We can find this pmf in a table of common distributions and recognize it as the pmf of the [[first success distribution]] (the geometric distribution supported on $\{1, 2, 3, \dots\}$).
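As a sanity check, here is a minimal simulation sketch of the donor-matching process; the value of `p`, the number of trials, and the helper `donors_until_match` are all made up for illustration. The observed frequencies should be close to $(1-p)^{x-1}\, p$:
```python
import random

p = 0.3            # assumed probability that any given donor is a match
trials = 100_000   # number of simulated patients

def donors_until_match(p: float) -> int:
    """Test donors one at a time; return how many were tested when the first match occurred."""
    count = 1
    while random.random() >= p:   # donor is not a match with probability 1 - p
        count += 1
    return count

samples = [donors_until_match(p) for _ in range(trials)]

# Compare empirical frequencies with the derived pmf for the first few values of x
for x in range(1, 6):
    empirical = samples.count(x) / trials
    theoretical = (1 - p) ** (x - 1) * p
    print(f"x={x}: simulated={empirical:.4f}  pmf={theoretical:.4f}")
```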