For two [[continuous random variable|continuous random variables]] $X$ and $Y$, $f(x,y)$ is the joint [[probability density function]] for $X$ and $Y$ if
$P(a \le X \le b,\, c \le Y \le d) = \int_a^b \int_c^d f(x,y) \ dy \ dx$
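This definition can be checked numerically. Below is a minimal sketch assuming a hypothetical joint pdf $f(x,y) = 4xy$ on the unit square (not from the original note), approximating the double integral with a midpoint Riemann sum:

```python
def joint_pdf(x, y):
    # hypothetical joint pdf f(x, y) = 4xy on [0, 1] x [0, 1]
    return 4 * x * y

def prob_rect(a, b, c, d, n=400):
    """Approximate P(a <= X <= b, c <= Y <= d) by a midpoint Riemann sum."""
    dx = (b - a) / n
    dy = (d - c) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx
        for j in range(n):
            y = c + (j + 0.5) * dy
            total += joint_pdf(x, y) * dx * dy
    return total

p = prob_rect(0, 0.5, 0, 0.5)
# For this pdf the exact value is 4 * (0.5^2 / 2) * (0.5^2 / 2) = 0.0625
```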
## For a sample
When we take a sample, the observed data are a collection of measurements $X_1, X_2, X_3, \dots, X_n$, where each $X_i$ is drawn from the same distribution. Because the variables are identically distributed and independent, the joint pdf of the sample is the product of the individual pdfs:
$f(x_1, x_2, \dots, x_n; \theta) = \prod_{i=1}^n f(x_i; \theta)$
Instead of writing each $x_i$, we can write the left hand side as a [[vector]] $f(\vec{x};\theta)$.
> [!NOTE]
> We can take the product of each univariate pdf or pmf because our data are independent.
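The product formula above can be sketched in code. This example assumes a hypothetical exponential pdf $f(x;\theta) = \theta e^{-\theta x}$ as the common univariate density (the original note leaves $f$ generic):

```python
import math

def exp_pdf(x, theta):
    # hypothetical univariate pdf: f(x; theta) = theta * exp(-theta * x)
    return theta * math.exp(-theta * x)

def joint_pdf(xs, theta):
    """Joint pdf of an iid sample: the product of the individual pdfs."""
    prod = 1.0
    for x in xs:
        prod *= exp_pdf(x, theta)
    return prod
```

For a sample $\vec{x} = (1, 2)$ with $\theta = 1$, the product is $e^{-1} \cdot e^{-2} = e^{-3}$.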
## Marginal pdf
The [[marginal probability]] density can be found by integrating the joint pdf over all values of the variable we don't want. The marginal pdfs can be written as
$f_X(x) = \int_{-\infty}^{\infty} f(x,y) \ dy$
$f_Y(y) = \int_{-\infty}^{\infty} f(x,y) \ dx$
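As a sketch of marginalization, assume the hypothetical joint pdf $f(x,y) = x + y$ on the unit square (not from the original note). Integrating out $y$ gives the marginal $f_X(x) = x + \tfrac{1}{2}$, which we can approximate numerically:

```python
def joint_pdf(x, y):
    # hypothetical joint pdf f(x, y) = x + y on [0, 1] x [0, 1]
    return x + y

def marginal_x(x, n=1000):
    """Approximate f_X(x) = integral of f(x, y) dy over y in [0, 1]."""
    dy = 1.0 / n
    return sum(joint_pdf(x, (j + 0.5) * dy) * dy for j in range(n))

# marginal_x(0.3) should be close to 0.3 + 0.5 = 0.8
```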