Bayesian statistics is a branch of statistics that incorporates prior knowledge when computing probabilities. The [[Bayesian]] approach differs from the frequentist approach, which considers only the long-run relative frequency of an event over many trials.
Bayesian methods include:
1. **Posterior distributions for a parameter** (or set of parameters). Computed using Bayes' theorem.
2. **Point estimation**. Uses the posterior distribution to find a single value that estimates an unknown parameter.
3. **Interval estimation**. Uses the posterior distribution to construct a range of values that, with some probability, contains the unknown parameter.
4. **Bayes factors**. Provide a method for hypothesis testing.
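As a concrete sketch of the first three methods (this example is not from the note itself: it assumes a coin-flip setting with 7 heads in 10 flips and a uniform prior, and approximates the posterior on a grid):

```python
# Hypothetical example: estimate a coin's heads probability theta
# after observing 7 heads in 10 flips, starting from a uniform prior.
grid = [i / 1000 for i in range(1001)]        # candidate theta values
prior = [1.0] * len(grid)                     # uniform prior pi(theta)
like = [t**7 * (1 - t)**3 for t in grid]      # binomial likelihood, up to a constant

# Posterior over the grid: likelihood * prior, normalized
unnorm = [l * p for l, p in zip(like, prior)]
z = sum(unnorm)                               # approximates the normalizing constant
post = [u / z for u in unnorm]

# Point estimate: the posterior mean
mean = sum(t * p for t, p in zip(grid, post))

# Interval estimate: a central 95% credible interval from the posterior CDF
cdf, acc = [], 0.0
for p in post:
    acc += p
    cdf.append(acc)
lo = next(t for t, c in zip(grid, cdf) if c >= 0.025)
hi = next(t for t, c in zip(grid, cdf) if c >= 0.975)
```

With a uniform prior the posterior here is Beta(8, 4), so the posterior mean lands near 8/12 ≈ 0.667, and `(lo, hi)` is a range that contains theta with 95% posterior probability.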
The Bayesian statistical model can be written as
$f(x, \theta) = f(x | \theta) \pi (\theta)$
where
- $f(x,\theta)$ is the [[joint probability]] of the parameters (i.e., hypotheses) and the data
- $\pi (\theta)$ is your [[prior]] distribution over the parameters
- $f(x|\theta)$ is the [[likelihood]] function. This is the [[conditional probability]] of the data $x$ given the parameters $\theta$.
Then
$\pi(\theta | x) = \frac{f(x, \theta)}{f(x)} = \frac{f(x | \theta) \pi (\theta)}{f(x)}$
where $\pi(\theta|x)$ is the [[posterior]] distribution, your updated belief over the parameters given the data $x$.
The denominator can be expanded by integrating the joint density over the parameters, giving
$\pi(\theta | x) = \frac{f(x | \theta) \pi (\theta)}{\int f(x | \theta) \pi(\theta) \, d\theta}$
In practice this integral is often intractable, which motivates approximation methods such as MCMC.
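In the discrete case the integral in the denominator becomes a sum. A hypothetical worked example (not from the note: a die that is either fair or loaded so that it rolls a six half the time, with a single observed six):

```python
# Hypothetical discrete example: is the die fair or loaded?
# The loaded die rolls a six with probability 1/2. We observe one six.
prior = {"fair": 0.5, "loaded": 0.5}            # pi(theta)
likelihood = {"fair": 1 / 6, "loaded": 1 / 2}   # f(x = six | theta)

# Denominator f(x): sum over theta of f(x|theta) * pi(theta)
fx = sum(likelihood[t] * prior[t] for t in prior)

# Posterior pi(theta | x) by Bayes' theorem
posterior = {t: likelihood[t] * prior[t] / fx for t in prior}
```

Here $f(x) = \tfrac{1}{2}\cdot\tfrac{1}{6} + \tfrac{1}{2}\cdot\tfrac{1}{2} = \tfrac{1}{3}$, so the posterior probability that the die is loaded rises from 0.5 to 0.75 after one six.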
Outline
- History of Bayes
- Comparison to Frequentist methods (Bernoulli's Fallacy)
- Bayes rule: My introduction to Bayes with COVID
- The discrete case: dice rolls
- Beta-binomial conjugate: coin toss?
- Other conjugate families
- Posterior predictive distribution
- Priors: improper, objective, uninformative, Jeffreys
- MCMC
- [[Bayesian regression]]
- [[nuisance parameters]]
```dataview
TABLE
WHERE up = this.file.link
```