# A priori distribution

The probability distribution of a random variable, to be contrasted with the conditional distribution of that random variable under certain additional conditions. The term "a priori distribution" is usually used in the following way. Let $(\Theta,X)$ be a pair of random variables (random vectors, or more general random elements). The random variable $\Theta$ is regarded as unknown, while $X$ is the result of an observation to be used for the estimation of $\Theta$. The joint distribution of $\Theta$ and $X$ is specified by the distribution of $\Theta$ (called the a priori distribution) together with the family of conditional distributions $\mathrm P_\theta$ of $X$ given $\Theta=\theta$. By the Bayes formula one can then compute the conditional distribution of $\Theta$ given $X$ (called the a posteriori distribution of $\Theta$).

In statistical problems the a priori distribution is often unknown, and even the assumption that it exists may not be well founded. For the use of the a priori distribution, see Bayesian approach.
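In this setting the Bayes formula can be written out explicitly. As a sketch (the densities $\pi$ and $p(x\mid\theta)$ are notation introduced here for illustration): if $\Theta$ has a priori density $\pi(\theta)$ and $\mathrm P_\theta$ has density $p(x\mid\theta)$, then the a posteriori density of $\Theta$ given the observation $X=x$ is

```latex
\pi(\theta \mid x)
  = \frac{p(x \mid \theta)\,\pi(\theta)}
         {\int p(x \mid \theta')\,\pi(\theta')\,\mathrm{d}\theta'} .
```

The denominator is the marginal density of the observation, so the a posteriori density integrates to one.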
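For a finite set of parameter values the Bayes formula reduces to elementary arithmetic. The following minimal sketch (a hypothetical example, not from the article: $\Theta$ distinguishes a fair coin from a biased one, and $X$ is the number of heads observed) turns an a priori distribution into the a posteriori distribution:

```python
# Hypothetical illustration: Theta takes two values ("fair", "biased"),
# and the observation X is the number of heads in n coin tosses.
from math import comb

def posterior(prior, likelihoods):
    """Bayes formula for a finite parameter set.

    prior:        dict theta -> a priori probability P(Theta = theta)
    likelihoods:  dict theta -> P_theta(X = x) for the observed x
    returns:      dict theta -> a posteriori probability P(Theta = theta | X = x)
    """
    joint = {t: prior[t] * likelihoods[t] for t in prior}
    total = sum(joint.values())  # marginal probability of the observation
    return {t: joint[t] / total for t in joint}

def binomial_pmf(k, n, p):
    # P(k heads in n tosses of a coin with heads probability p)
    return comb(n, k) * p**k * (1 - p)**(n - k)

# A priori distribution: both coins equally likely.
prior = {"fair": 0.5, "biased": 0.5}
heads_prob = {"fair": 0.5, "biased": 0.8}

# Observation: x = 8 heads in n = 10 tosses.
lik = {t: binomial_pmf(8, 10, heads_prob[t]) for t in prior}
post = posterior(prior, lik)
```

After observing 8 heads in 10 tosses, the a posteriori distribution shifts most of the probability mass to the biased coin, as expected.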