A priori distribution

From Encyclopedia of Mathematics
Revision as of 12:14, 27 October 2014

The probability distribution of a random variable, to be contrasted with the conditional distribution of this random variable under certain additional conditions. Usually the term "a priori distribution" is used in the following way. Let $(\Theta,X)$ be a pair of random variables (random vectors or, more generally, random elements). The random variable $\Theta$ is considered to be unknown, while $X$ is the result of an observation to be used for the estimation of $\Theta$. The joint distribution of $\Theta$ and $X$ is given by the distribution of $\Theta$ (called the a priori distribution) and the set of conditional probabilities $\mathbb P_\theta$ of the random variable $X$ given $\Theta=\theta$. By the Bayes formula one can then calculate the conditional distribution of $\Theta$ given $X$ (called the a posteriori distribution of $\Theta$). In statistical problems the a priori distribution is often unknown (and even the assumption of its existence is not always well founded). For the use of the a priori distribution, see Bayesian approach.
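For instance, in the discrete case the calculation by the Bayes formula takes the following explicit form (a sketch, assuming the a priori distribution has density $\pi(\theta)=\mathbb P(\Theta=\theta)$ and writing $\mathbb P_\theta(X=x)$ for the conditional probabilities above):

$$\mathbb P(\Theta=\theta\mid X=x)=\frac{\mathbb P_\theta(X=x)\,\pi(\theta)}{\sum_{\theta'}\mathbb P_{\theta'}(X=x)\,\pi(\theta')}.$$

The left-hand side, regarded as a function of $\theta$ for the observed value $x$, is the a posteriori distribution of $\Theta$; the denominator is the unconditional probability $\mathbb P(X=x)$.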


How to Cite This Entry:
A priori distribution. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=A_priori_distribution&oldid=34101
This article was adapted from an original article by Yu.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article