
Regression coefficient



A coefficient of an independent variable in a regression equation. For example, in the linear regression equation $ {\mathsf E}(Y \mid X = x) = \beta_0 + \beta_1 x $, relating the random variables $Y$ and $X$, the regression coefficients $\beta_0$ and $\beta_1$ are given by

$$ \beta_0 = m_2 - \rho \frac{\sigma_2}{\sigma_1} m_1, \qquad \beta_1 = \rho \frac{\sigma_2}{\sigma_1}, $$

where $\rho$ is the correlation coefficient of $X$ and $Y$, $m_1 = {\mathsf E} X$ and $m_2 = {\mathsf E} Y$ are the means, and $\sigma_1^2 = {\mathsf D} X$ and $\sigma_2^2 = {\mathsf D} Y$ are the variances of $X$ and $Y$. The calculation of estimates for the regression coefficients from observed data (the sample regression coefficients) is a fundamental problem of regression analysis.
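In practice the sample regression coefficients are obtained by substituting sample moments for $\rho$, $m_i$ and $\sigma_i$ in the formulas above. The following is a minimal sketch in Python (NumPy is assumed available; the function name and the example data are illustrative only, not part of the original article):

    import numpy as np

    def sample_regression_coefficients(x, y):
        """Estimate beta_0 and beta_1 in E(Y | X = x) = beta_0 + beta_1 x
        by substituting sample moments into the formulas above."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        m1, m2 = x.mean(), y.mean()              # sample means of X and Y
        s1, s2 = x.std(ddof=1), y.std(ddof=1)    # sample standard deviations
        rho = np.corrcoef(x, y)[0, 1]            # sample correlation coefficient
        beta1 = rho * s2 / s1
        beta0 = m2 - beta1 * m1
        return beta0, beta1

    # Example: noisy linear data with true intercept 2 and slope 3
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=100)
    print(sample_regression_coefficients(x, y))  # approximately (2.0, 3.0)

With these substitutions, $\beta_1$ coincides with the ordinary least-squares estimate of the slope, and $\beta_0$ with the corresponding intercept estimate.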

This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.