Diachronic ("related to change over time") is the interpretation of [[Bayes rule]] that is used in [[Bayesian statistics]]. This interpretation provides a way to update the probability of a hypothesis $H$ given some data $D$. Using data and a prior probability to compute a posterior probability is called a **Bayesian update**.
$P(H|D) = \frac{P(H)P(D|H)}{P(D)}$
- $P(H|D)$ is the probability of the hypothesis given the data, the [[posterior]]
- $P(H)$ is the prior probability of the hypothesis, the [[prior]]
- $P(D|H)$ is the probability of the data under the hypothesis, called the [[likelihood]]
- $P(D)$ is the total probability of the data across all hypotheses, often called the **normalizing constant**
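As a concrete illustration, here is a minimal sketch of a single Bayesian update, assuming the classic two-bowl cookie problem (an example chosen for illustration, not taken from this note): bowl 1 holds 30 vanilla and 10 chocolate cookies, bowl 2 holds 20 of each, and we update the probability that a drawn vanilla cookie came from bowl 1.

```python
# A minimal sketch of one Bayesian update, assuming the classic
# two-bowl cookie problem: bowl 1 has 30 vanilla / 10 chocolate,
# bowl 2 has 20 of each. Hypothesis H: the cookie came from bowl 1.
# Data D: the drawn cookie is vanilla.

prior = 0.5                # P(H): both bowls equally likely a priori
likelihood = 30 / 40       # P(D|H): probability of vanilla from bowl 1

# P(D): total probability of drawing vanilla from either bowl
p_data = prior * (30 / 40) + (1 - prior) * (20 / 40)

posterior = prior * likelihood / p_data   # P(H|D)
print(posterior)                          # 0.6
```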
Computing the total probability of the data $P(D)$ is often the trickiest part. One approach is to specify a suite of hypotheses $H_1, H_2, \ldots, H_n$ that are [[mutually exclusive and collectively exhaustive]] and apply the [[Law of Total Probability]]:
$P(D) = \sum_{i} P(D|H_i) \cdot P(H_i)$
With just two complementary hypotheses $H$ and $H'$, this reduces to $P(D) = P(D|H) \cdot P(H) + P(D|H') \cdot P(H')$.
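A short sketch of this normalization step over a suite of hypotheses; the three coin-bias hypotheses and their numbers are assumptions chosen purely for illustration.

```python
# Sketch: normalize over a mutually exclusive, collectively exhaustive
# suite of hypotheses about a coin's bias (illustrative numbers).
priors = {"fair": 1/3, "biased_heads": 1/3, "biased_tails": 1/3}
likelihoods = {"fair": 0.5, "biased_heads": 0.8, "biased_tails": 0.2}  # P(heads | H_i)

# Law of Total Probability: P(D) = sum_i P(D|H_i) * P(H_i)
p_data = sum(priors[h] * likelihoods[h] for h in priors)

posteriors = {h: priors[h] * likelihoods[h] / p_data for h in priors}
assert abs(sum(posteriors.values()) - 1.0) < 1e-12  # posterior sums to 1
```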
The total probability of the data is called the normalizing constant because it ensures that the posterior $P(H|D)$ is a proper probability distribution, i.e. that it sums to $1$. Because it is constant across hypotheses, it can be dropped whenever a distribution merely proportional to the true posterior (true under our assumptions and data) suffices, for example when finding the [[maximum a posteriori estimator]].
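A sketch of why the constant can be dropped: the unnormalized product $P(D|H) \cdot P(H)$ has the same argmax as the posterior, so the MAP hypothesis can be found without ever computing $P(D)$. The hypotheses and numbers here are illustrative assumptions.

```python
# Sketch: finding the MAP hypothesis without the normalizing constant.
# Illustrative data D: 8 heads in a row; two assumed hypotheses.
priors = {"fair": 0.5, "biased": 0.5}
likelihoods = {"fair": 0.5 ** 8, "biased": 0.8 ** 8}  # P(D | H)

# Unnormalized posterior: proportional to the true posterior,
# since dividing by P(D) would not change which value is largest.
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
map_hypothesis = max(unnormalized, key=unnormalized.get)
print(map_hypothesis)  # "biased"
```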