Sample function

sample path

A function $ X _ {t} = X _ {t} ( \omega ) $ of the argument $ t $ which corresponds uniquely to each observation of a random process $ X _ {t} \in E $, $ t \in T $, where $ \{ \omega \} = \Omega $ is the set of elementary events. The equivalent terms "realization of a random process" and "trajectory of a random process" are also frequently employed. A random process $ X _ {t} $ is characterized by a probability measure on the space of its sample functions. In studying the local properties of the sample functions of $ X _ {t} $ (where $ E = \mathbf R ^ {1} $ and $ T = \mathbf R ^ {m} $ is the Euclidean space of dimension $ m = 1, 2 ,\dots $) it is assumed that $ X _ {t} $ is a separable random process or that an equivalent random process whose sample functions have the required local properties can be found. The local properties of the sample functions of Gaussian processes (cf. Gaussian process) have been studied most extensively.
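For example, for the Wiener process (cf. Wiener process) almost all sample functions are continuous but nowhere differentiable. On the other hand, a random process that is stochastically equivalent to the Wiener process (that is, coincides with it with probability one at every fixed $ t $) may have sample functions with discontinuities; this is why local properties are stated for a separable version of the process.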

For Gaussian random processes (fields) $ X _ {t} $ the following alternative holds: almost all sample functions are either continuous or unbounded on some interval. For $ t, s \in T $ define the "distance" $ d ( t, s) = [ {\mathsf E} | X _ {t} - X _ {s} | ^ {2} ] ^ {1/2} $ and the "ball" $ B ( t, \delta ) = \{ {s } : {d ( s, t) \leq \delta } \} $, and let $ N ( \delta ) $ be the minimum number of such "balls" needed to cover $ T \subset \mathbf R ^ {n} $; it is assumed that $ \sup _ {s , t \in T } d( s, t) < \infty $. A necessary and sufficient condition for the continuity of the sample functions of a homogeneous Gaussian process has the form

$$ \exists q > 1 : \sum q ^ {- n } \sqrt { \mathop{\rm ln} N ( q ^ {- n } ) } < \infty . $$
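To illustrate this criterion, consider a stationary Gaussian process on $ T = [ 0, 1] $ whose covariance satisfies $ {\mathsf E} | X _ {t} - X _ {s} | ^ {2} \leq C | t - s | ^ \alpha $ for some $ 0 < \alpha \leq 2 $ (e.g. the Ornstein–Uhlenbeck process, for which this holds with $ \alpha = 1 $). Then $ d ( t, s) \leq C ^ {1/2} | t - s | ^ {\alpha /2 } $, so $ N ( \delta ) \leq C _ {1} \delta ^ {- 2/ \alpha } $ for a suitable constant $ C _ {1} \geq 1 $, and for any $ q > 1 $,

$$ \sum _ { n } q ^ {- n } \sqrt { \mathop{\rm ln} N ( q ^ {- n } ) } \leq \ \sum _ { n } q ^ {- n } \sqrt { \mathop{\rm ln} C _ {1} + \frac{2 n }{ \alpha } \mathop{\rm ln} q } < \infty , $$

so that almost all sample functions are continuous.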

If

$$ R ( t) = {\mathsf E} X _ {s} X _ {t + s } = \ \int\limits _ {- \infty } ^ \infty e ^ {it \lambda } \ dF ( \lambda ),\ {\mathsf E} X _ {t} = 0, $$

is concave in some neighbourhood of the point $ 0+ $, then for the sample function $ X _ {t} $ to be continuous it is necessary and sufficient that $ \sum S _ {n} ^ {1/2} < \infty $, where $ S _ {n} = F( 2 ^ {n + 1 } ) - F( 2 ^ {n} ) $.
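To illustrate, suppose in addition that the spectral function has a density $ f ( \lambda ) $ which for large $ \lambda $ behaves like $ \lambda ^ {- 1} ( \mathop{\rm ln} \lambda ) ^ {- p } $, $ p > 1 $ (and that the concavity assumption on $ R $ holds). Then

$$ S _ {n} = \int\limits _ {2 ^ {n} } ^ { {2 ^ {n + 1 } } } f ( \lambda ) d \lambda \sim c n ^ {- p } , $$

so that $ \sum S _ {n} ^ {1/2} $ converges if and only if $ p > 2 $: almost all sample functions are continuous for $ p > 2 $ and fail to be continuous for $ 1 < p \leq 2 $.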

If $ R $ is concave in a neighbourhood of $ 0+ $ and if

$$ {\mathsf E} | X _ {t} - X _ {s} | ^ {2} \geq \ \frac{C}{| \mathop{\rm ln} | t - s | | } $$

for $ | t - s | < \delta $, then almost all sample functions of the Gaussian random process $ X _ {t} $ are unbounded. If

$$ {\mathsf E} | X _ {t} - X _ {s} | ^ {2} \leq \ \frac{C}{| \mathop{\rm ln} | t - s | | ^ {1 + \epsilon } } ,\ \ \epsilon > 0, $$

then almost all sample functions of the Gaussian random process (field) $ X _ {t} $ are continuous. For the sample function of a Gaussian random process to be continuous it is necessary and sufficient that

$$ \int\limits _ { 0 } ^ \infty \omega _ {R} ( e ^ {- x ^ {2} } ) dx < \infty , $$

where $ R ( t, s) = {\mathsf E} X _ {t} X _ {s} $,

$$ \omega _ {R} ( \delta ) = \sup \ [ R ( t + h _ {1} , s + h _ {2} ) - R ( t, s)] ^ {1/2} . $$

Here the supremum is taken over $ | h _ {i} | < \delta $, $ | t | \leq C $, $ | s | \leq C $.
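To illustrate this criterion, suppose that $ \omega _ {R} ( \delta ) \leq C _ {2} \delta ^ {a} $ for some $ a > 0 $ and all sufficiently small $ \delta $; for instance, for the Ornstein–Uhlenbeck covariance $ R ( t, s) = e ^ {- | t - s | } $ this holds with $ a = 1/2 $. Since $ \omega _ {R} $ is bounded when $ R $ is bounded on $ | t | \leq C $, $ | s | \leq C $, splitting the integral at a suitable point $ x _ {0} $ gives

$$ \int\limits _ { 0 } ^ \infty \omega _ {R} ( e ^ {- x ^ {2} } ) dx \leq \ C _ {3} + C _ {2} \int\limits _ {x _ {0} } ^ \infty e ^ {- a x ^ {2} } dx < \infty , $$

so that almost all sample functions are continuous.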

The sample function $ X _ {t} $, $ t \in \mathbf R ^ {n} $, is in the class $ H( C, \alpha _ {1} \dots \alpha _ {n} ) $ if for all sufficiently small $ h _ {i} $,

$$ | X _ {t + h } - X _ {t} | \leq \ C \sum _ {i = 1 } ^ { n } | h _ {i} | ^ {\alpha _ {i} } , $$

$$ C > 0,\ 0 < \alpha _ {i} \leq 1,\ h = ( h _ {1} \dots h _ {n} ). $$

If $ X _ {t} $ is a Gaussian random field on the unit cube $ V _ {n} ^ {0} $ in $ \mathbf R ^ {n} $ such that for sufficiently small $ h $ and $ t \in V _ {n} ^ {0} $,

$$ {\mathsf E} | X _ {t + h } - X _ {t} | ^ {2} \leq \ C _ {1} \frac{| h | ^ \gamma }{| \mathop{\rm ln} | h | | } ,\ \ C _ {1} > 0,\ 0 < \gamma \leq 2, $$

then, with probability one, uniformly in $ t \in V _ {n} ^ {0} $,

$$ X _ {t} \in H ( C, \beta _ {1} \dots \beta _ {n} ) $$

for any $ C > 0 $ and $ \beta _ {i} \leq \gamma /2 $.
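For example, let $ X _ {t} $ be the field with $ {\mathsf E} X _ {t} = 0 $ and $ {\mathsf E} X _ {t} X _ {s} = { \frac{1}{2} } ( | t | ^ \alpha + | s | ^ \alpha - | t - s | ^ \alpha ) $, $ 0 < \alpha \leq 1 $, which appears again below; then $ {\mathsf E} | X _ {t + h } - X _ {t} | ^ {2} = | h | ^ \alpha $. For every $ \gamma < \alpha $ one has $ | h | ^ \alpha \leq C _ {1} | h | ^ \gamma / | \mathop{\rm ln} | h | | $ for sufficiently small $ h $, so almost all sample functions belong to $ H ( C, \beta \dots \beta ) $ for every $ C > 0 $ and every $ \beta < \alpha /2 $, i.e. they satisfy a Hölder condition of every order less than $ \alpha /2 $.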

A non-decreasing continuous function $ \phi ( x) $, $ x \in \mathbf R ^ {1} $, is called an upper function if for almost all $ \omega $ there exists an $ \epsilon = \epsilon ( \omega ) $ such that

$$ | X _ {t} - X _ {s} | \leq ( {\mathsf E} | X _ {t} - X _ {s} | ^ {2} ) ^ {1/2} \phi \left ( \frac{1}{| t - s | } \right ) $$

for $ | t - s | \leq \epsilon $; $ t, s \in \mathbf R ^ {n} $; $ | t | = ( \sum _ {i = 1 } ^ {n} t _ {i} ^ {2} ) ^ {1/2} $. If $ X _ {t} $ is a Gaussian random field with

$$ {\mathsf E} X _ {t} = 0,\ \ {\mathsf E} X _ {t} X _ {s} = { \frac{1}{2} } ( | t | ^ \alpha + | s | ^ \alpha - | t - s | ^ \alpha ),\ \ 0 < \alpha \leq 1 , $$

then $ \phi ( x) $ is an upper function if and only if

$$ \int\limits _ { e } ^ \infty t ^ {n - 1 } K [ \phi ( t)] dt < \infty , $$

where

$$ K [ x] = x ^ {( 4n/ \alpha ) - 1 } e ^ {- x ^ {2} /2 } . $$
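For example, take $ \phi ( x) = c ( \mathop{\rm ln} x ) ^ {1/2} $ with $ c > 0 $. Then $ K [ \phi ( t)] = ( c ^ {2} \mathop{\rm ln} t ) ^ {( 2n/ \alpha ) - 1/2 } t ^ {- c ^ {2} /2 } $, and the integrand $ t ^ {n - 1 } K [ \phi ( t)] $ is integrable at infinity if and only if $ c ^ {2} > 2n $. Thus $ c ( \mathop{\rm ln} x ) ^ {1/2} $ is an upper function precisely when $ c > \sqrt {2n } $; for $ n = 1 $, $ \alpha = 1 $ this is consistent with Lévy's modulus of continuity for the Wiener process.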

For almost all sample functions of a Gaussian random process to be analytic in a neighbourhood of a point $ t _ {0} $ it is necessary and sufficient that the covariance function $ R( t, s) $ be analytic in $ t $ and $ s $ in a neighbourhood $ | t - t _ {0} | < \delta $, $ | s - t _ {0} | < \delta $, $ \delta > 0 $.
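For example, for a stationary Gaussian process with covariance function $ R ( t, s) = e ^ {- ( t - s ) ^ {2} } $ the covariance is an entire function of $ t $ and $ s $, and almost all sample functions are analytic; for the Ornstein–Uhlenbeck process, with $ R ( t, s) = e ^ {- | t - s | } $, the covariance is not analytic at the points $ t = s $, and almost all sample functions are not analytic (they are, in fact, nowhere differentiable).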

References

[1] J.L. Doob, "Stochastic processes" , Chapman & Hall (1953)
[2] H. Cramér, M.R. Leadbetter, "Stationary and related stochastic processes" , Wiley (1967) Chapts. 33–34
[3] Yu.K. Belyaev, "Continuity and Hölder's conditions for sample functions of stationary Gaussian processes" , Proc. 4-th Berkeley Symp. Math. Stat. Probab. , 2 , Univ. California Press (1961) pp. 23–33
[4] E.I. Ostrovskii, "On the local structure of Gaussian fields" Soviet Math. Dokl. , 11 : 6 (1970) pp. 1425–1427; Dokl. Akad. Nauk SSSR , 195 : 1 (1970) pp. 40–42
[5] M. Nisio, "On the continuity of stationary Gaussian processes" Nagoya Math. J. , 34 (1969) pp. 89–104
[6] R.M. Dudley, "Gaussian processes on several parameters" Ann. Math. Statist. , 36 : 3 (1965) pp. 771–788
[7] X. Fernique, "Continuité des processus Gaussiens" C.R. Acad. Sci. Paris Sér. I Math. , 258 (1964) pp. 6058–6060
[8] M.I. Yadrenko, "Local properties of sample functions of random fields" Visnik Kiiv. Univ. Ser. Mat. Mekh. , 9 (1967) pp. 103–112 (In Ukrainian) (English abstract)
[9] T. Kawada, "On the upper and lower class for Gaussian processes with several parameters" Nagoya Math. J. , 35 (1969) pp. 109–132
[10] Yu.K. Belyaev, "Analytical random processes" Theory Probab. Appl. , 4 : 4 (1959) pp. 402–409; Teor. Veroyatnost. i Primenen. , 4 : 4 (1959) pp. 437–444
[11] E.E. Slutskii, "Qualche proposizione relativa alla teoria delle funzioni aleatorie" Giorn. Ist. Ital. Attuari , 8 : 2 (1937) pp. 183–199
[12] X.M. Fernique, "Régularité des trajectoires des fonctions aléatoires gaussiennes" J.P. Conze (ed.) J. Gani (ed.) X.M. Fernique (ed.) , École d'Été de Probabilités de Saint-Flour IV-1974 , Springer (1975) pp. 1–96

Comments

References

[a1] R.J. Adler, "The geometry of random fields" , Wiley (1981)
How to Cite This Entry:
Sample function. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Sample_function&oldid=48608
This article was adapted from an original article by Yu.K. Belyaev (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article