# EDRM 711 Final


The flashcards below were created by user tnrose87 on FreezingBlue Flashcards.

1. Backward Regression
Start with all predictors; remove the predictor with the smallest (k−1)th-order squared semi-partial correlation (based on the number of predictors in the equation). If removing a predictor produces a significant change in R2, stop and use the model that still includes the last-removed predictor.
2. Confounding Variable
A variable that affects two or more of the variables under study. Example: age is a confounding variable when examining the relationship between math scores and height.
3. Semi-partial Correlations (r1(2.3))
The PPMCC between two variables from which the effects of one (or more) other variables have been removed (or partialled) from only one of the two variables. AKA The Uniqueness Index when squared (a predictor's unique contribution to R2)

The predictor with the largest Beta has the highest semi-partial.
4. Spurious Correlation
a correlation between two variables that does not result from any direct relation between them but from their relation to other variables. E.g. Height and math scores - the correlation appears because of the third variable, age. But when age is controlled for, the correlation disappears.
5. Partial Correlation (r12.3)
The PPMCC between two variables from which the effects of one or more other variables have been removed (i.e. partialled) from the two variables. Tells you the magnitude of the correlation.

• Notation is symmetric: r12.3 = r21.3
• ***Must take the square root from the SAS printout. ***
6. Suppressor Variable
A variable that is uncorrelated with one of the original variables yet, when it is partialled out, increases the apparent relationship between the two original variables.

E.g. Math computational scores vs. Math Word Problem Scores, reading is a suppressor variable because it affects the relationship between comp and word problem scores.
7. Detecting Multicollinearity
Tolerance: Lower values=higher multicollinearity (≤.2)

VIF: Higher values=higher multicollinearity (≥4)
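These two diagnostics are reciprocals of each other. A minimal sketch, assuming just two predictors (so a predictor's tolerance reduces to 1 − r², where r is the correlation between the predictors — an assumed simplification, not the general multi-predictor formula):

```python
# Hypothetical two-predictor case: tolerance = 1 - r^2, VIF = 1 / tolerance.
def tolerance(r_between_predictors):
    """Proportion of a predictor's variance NOT explained by the other predictor."""
    return 1 - r_between_predictors ** 2

def vif(r_between_predictors):
    """Variance Inflation Factor: the reciprocal of tolerance."""
    return 1 / tolerance(r_between_predictors)

r = 0.95  # assumed: strongly correlated predictors
print(round(tolerance(r), 4))  # 0.0975 -> below the .2 rule of thumb
print(round(vif(r), 2))        # 10.26  -> above the VIF >= 4 cutoff
```

Both cutoffs flag the same cases, since tolerance ≤ .2 is the same condition as VIF ≥ 5.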
8. Collinearity
When a predictor variable has a perfect or almost perfect relationship with other predictor variables, while ignoring the dependent variable.
9. Residual
• Actual value − predicted value
• Squaring and summing the residuals gives SSerror
10. Stepwise Regression
Combination of Forward and Backward regression. Start with the single best predictor, add the best available predictor given what is already in the equation; if there is a significant change in R2, remove only the non-contributing predictors. Predictors need to be significant to get into and to stay in the model.
11. Model Building
- Based on Theory

- The process to find the least complicated and best fitting model
12. R2
R2 = (SStotal − SSerror) / SStotal
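A minimal sketch of this formula with made-up observed and predicted values (the numbers below are assumed, purely for illustration):

```python
# R^2 = (SStotal - SSerror) / SStotal, computed from scratch.
y     = [2.0, 4.0, 6.0, 8.0]   # observed values (assumed)
y_hat = [2.5, 3.5, 6.5, 7.5]   # predicted values (assumed)

y_bar = sum(y) / len(y)
ss_total = sum((yi - y_bar) ** 2 for yi in y)                      # deviation from mean
ss_error = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))         # residual SS

r_squared = (ss_total - ss_error) / ss_total
print(r_squared)  # 0.95
```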
13. Outliers: DFBETA
Indicates change in b if observation is deleted.

Cut-off is 2/√n
14. Cook's D
Indicates influence by taking into account both the size of the error and the leverage.

Should be <1
15. Outliers: Studentized Residuals
t-value obtained by dividing error by its standard error

Cut-off is |t| > 2
16. Regression Line
All of the y "hats" or predicted y values fall on the line
17. SSerror
Residual SS or Unexplained SS

Deviation between observed and predicted value
18. SSmodel
• Explained SS or Regression SS
• Deviation between predicted value and the mean of the dependent variable
19. SStotal
• Deviation between observed value and the mean of the dependent variable
• SStotal = Σ(y − ȳ)²
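Cards 17–19 fit together as the identity SStotal = SSmodel + SSerror, which holds when the predicted values come from a least-squares fit with an intercept. A sketch with a tiny made-up data set (all values assumed):

```python
# Verify SStotal = SSmodel + SSerror for a simple OLS fit.
x = [1.0, 2.0, 3.0, 4.0]   # predictor (assumed)
y = [2.0, 3.0, 5.0, 6.0]   # dependent variable (assumed)

x_bar = sum(x) / len(x)
y_bar = sum(y) / len(y)
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
    sum((xi - x_bar) ** 2 for xi in x)          # OLS slope
a = y_bar - b * x_bar                           # OLS intercept
y_hat = [a + b * xi for xi in x]

ss_total = sum((yi - y_bar) ** 2 for yi in y)                       # observed - mean
ss_model = sum((yh - y_bar) ** 2 for yh in y_hat)                   # predicted - mean
ss_error = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))          # observed - predicted

print(round(ss_total, 6), round(ss_model + ss_error, 6))  # 10.0 10.0
```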
20. Assumptions of LR
• *The true conditional probabilities are a logistic function of the independent variables
• *No important variables are omitted
• *No extraneous variables are included
• *The independent variables are measured without error
• *The observations are independent
• *The independent variables are not linear combinations of each other
21. LR: Probability
pi (range: 0 to 1)
22. Model Effect Size
• R2L = [(−2LLnull) − (−2LLk)] / (−2LLnull)
• The proportion of the null deviance accounted for by the predictors
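A quick sketch of this effect-size formula with hypothetical deviance values (the numbers below are assumed, not from the course):

```python
# R^2_L = (D_null - D_k) / D_null, where D = -2LL (deviance).
d_null = 138.6   # deviance of the intercept-only model (assumed)
d_k    = 110.9   # deviance of the model with k predictors (assumed)

r2_L = (d_null - d_k) / d_null
print(round(r2_L, 3))  # 0.2 -> predictors account for ~20% of the null deviance
```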
23. Model Fit
• χ2 = (−2LLsmall) − (−2LLlarge); compare the drop to the χ2 critical value
• −2LL is also known as deviance or misfit

***Want to see a drop in deviance***
24. Predicted Probabilities
predicted log(odds) = .2 + .5x1 + .1x2

predicted odds = e^(.2 + .5x1 + .1x2)

• P = 1 / (1 + e^−(.2 + .5x1 + .1x2))
• Replace x1 and x2 w/ given values
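A minimal sketch that plugs assumed values of x1 and x2 into the card's equation (x1 = 1 and x2 = 2 are made-up inputs):

```python
import math

# Using the card's equation: logit = .2 + .5*x1 + .1*x2.
def predicted_probability(x1, x2):
    logit = 0.2 + 0.5 * x1 + 0.1 * x2   # predicted log(odds)
    odds = math.exp(logit)              # predicted odds
    return odds / (1 + odds)            # same as 1 / (1 + e^-logit)

p = predicted_probability(1, 2)   # assumed values -> logit = 0.9
print(round(p, 3))  # 0.711
```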
25. LR: Odds Ratios
predicted log(odds) = .2 + .5x1 + .1x2

predicted odds = e^(.2 + .5x1 + .1x2)

OR = e^b for a one-unit increase in a predictor; (OR − 1) × 100 = % change in the odds
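A sketch using the coefficient on x1 from the card's equation (b1 = .5), converting a log-odds coefficient into an odds ratio and a percent change:

```python
import math

b1 = 0.5                              # coefficient on x1 from the card's equation
odds_ratio = math.exp(b1)             # OR for a one-unit increase in x1
pct_change = (odds_ratio - 1) * 100   # % change in the odds

print(round(odds_ratio, 3))  # 1.649
print(round(pct_change, 1))  # 64.9 -> odds increase ~64.9% per unit of x1
```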
26. LR: Loge(Odds)
LN(Odds) (range: -infinity to +infinity)
27. LR: Odds Ratio
Odds for Group A/Odds for Group B
28. LR: Odds
pi / (1 − pi) (range: 0 to +infinity)
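Cards 21, 26, and 28 are three views of the same quantity. A round-trip sketch with an assumed probability of 0.8:

```python
import math

p = 0.8                     # probability, range (0, 1) -- assumed value
odds = p / (1 - p)          # odds, range (0, +infinity)
log_odds = math.log(odds)   # ln(odds), range (-infinity, +infinity)

# Going back: probability recovered from log(odds)
p_back = 1 / (1 + math.exp(-log_odds))

print(round(odds, 3))    # 4.0
print(round(p_back, 3))  # 0.8
```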
29. Forward Regression
Seek the predictor with the largest correlation with the criterion. Start with the single best predictor, then add the best available predictor given what is already in the equation, as long as it produces a significant change in R2.

### Card Set Information

Author: tnrose87
ID: 216881
Filename: EDRM 711 Final
Updated: 2013-05-06 04:26:54
Tags: Stats
Description: EDRM Final Exam
