The Geometry of Enhancement in Multiple Regression |
| |
Authors: | Niels G. Waller
| |
Institution: | Department of Psychology, University of Minnesota, Minneapolis, USA
| |
Abstract: | In linear multiple regression, "enhancement" is said to occur when R² = b′r > r′r, where b is a p×1 vector of standardized regression coefficients and r is a p×1 vector of correlations between a criterion y and a set of standardized regressors, x. When p = 1, then b ≡ r and enhancement cannot occur. When p = 2, for all full-rank R_xx ≠ I, R_xx = E[xx′] = VΛV′ (where VΛV′ denotes the eigen decomposition of R_xx; λ₁ > λ₂), the set B₁ := {bᵢ : R² = bᵢ′rᵢ = rᵢ′rᵢ; 0 < R² ≤ 1} contains four vectors, and the set B₂ := {bᵢ : R² = bᵢ′rᵢ > rᵢ′rᵢ; 0 < R² ≤ 1; R²λ_p ≤ rᵢ′rᵢ < R²} contains an uncountably infinite number of vectors. When p ≥ 3 (and λ₁ > λ₂ > ⋯ > λ_p), both sets contain an uncountably infinite number of vectors. Geometrical arguments demonstrate that B₁ occurs at the intersection of two hyper-ellipsoids in ℝᵖ. Equations are provided for populating the sets B₁ and B₂ and for demonstrating that maximum enhancement occurs when b is collinear with the eigenvector that is associated with λ_p (the smallest eigenvalue of the predictor correlation matrix). These equations are used to illustrate the logic and the underlying geometry of enhancement in population multiple-regression models. R code for simulating population regression models that exhibit enhancement of any degree and any number of predictors is included in Appendices A and B. |
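The enhancement condition and the role of λ_p described in the abstract can be illustrated numerically. Below is a minimal Python sketch for p = 2 (the paper's own simulation code, in R, is in Appendices A and B, and is not reproduced here). The values r12 = 0.7 and r = (0.5, 0)′ are illustrative assumptions, not taken from the paper.

```python
# Illustration of "enhancement" for p = 2 standardized predictors:
# R^2 = b'r exceeds r'r, where b = R_xx^{-1} r.
r12 = 0.7            # assumed predictor intercorrelation (off-diagonal of R_xx)
r = (0.5, 0.0)       # assumed correlations of x1, x2 with the criterion y

# b = R_xx^{-1} r, using the closed-form inverse of a 2x2 correlation matrix
det = 1.0 - r12 ** 2
b = ((r[0] - r12 * r[1]) / det, (r[1] - r12 * r[0]) / det)

R2 = b[0] * r[0] + b[1] * r[1]    # R^2 = b'r
rr = r[0] ** 2 + r[1] ** 2        # r'r

print(f"R^2 = {R2:.4f}, r'r = {rr:.4f}")  # R^2 = 0.4902, r'r = 0.2500
print("enhancement:", R2 > rr)            # enhancement: True

# Maximum enhancement: choose b collinear with the eigenvector v_p = (1, -1)/sqrt(2)
# of the smallest eigenvalue lambda_p = 1 - r12, scaled so that R^2 = 1.
lam_p = 1.0 - r12
c = (1.0 / lam_p) ** 0.5
b_max = (c / 2 ** 0.5, -c / 2 ** 0.5)
r_max = (b_max[0] + r12 * b_max[1],       # r = R_xx b
         r12 * b_max[0] + b_max[1])
R2_max = b_max[0] * r_max[0] + b_max[1] * r_max[1]
rr_max = r_max[0] ** 2 + r_max[1] ** 2

# At the maximum, r'r = R^2 * lambda_p -- the lower boundary of the set B2.
print(f"R^2 = {R2_max:.4f}, r'r = {rr_max:.4f}")  # R^2 = 1.0000, r'r = 0.3000
```

The second half of the sketch shows why the smallest eigenvalue governs maximum enhancement: when b lies along v_p, r = R_xx·b shrinks by the factor λ_p, so r′r = R²λ_p is as small as the definition of B₂ permits relative to R².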
| |
Keywords: | |
|