The flashcards below were created by user MRK on FreezingBlue Flashcards.

Simple v. Multiple regression
 Simple: one explanatory (x) variable
 Multiple: more than one explanatory variable
 Both assume the relationship is linear

Smoothing curve v. Regression line
 Smoothing curve: Not straight, fits a curve using a given percentage of the points
 Regression Line: straight line using a statistical method

r^{2} value (coefficient of determination; in multiple regression, the coefficient of multiple determination)
 how well the model fits
 how much of the variation in y is explained by x

adjusted r^{2}
adjusts r^{2} by dividing each sum of squares by its associated degrees of freedom
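The two cards above can be sketched numerically. This is a minimal illustration with made-up data points; the variable names (sse, ssto) are assumptions, not from the cards:

```python
# Made-up data for illustration
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

# Least-squares slope and intercept
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar

yhat = [b0 + b1 * xi for xi in x]                     # predicted values
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))  # error sum of squares
ssto = sum((yi - ybar) ** 2 for yi in y)              # total sum of squares

r2 = 1 - sse / ssto
# Adjusted r^2: divide each sum of squares by its degrees of freedom
# (n - 2 for SSE with one predictor, n - 1 for SSTO)
adj_r2 = 1 - (sse / (n - 2)) / (ssto / (n - 1))
```

Adjusted r² is always at most r², and the gap grows as predictors are added without explaining more variation.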

residual
 observed − predicted (the vertical distance from the data point to the line)
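A quick numeric sketch of the residual definition; the fitted line and the data point here are invented for illustration:

```python
# Hypothetical fitted line: yhat = 0.5 + 2.0 * x
b0, b1 = 0.5, 2.0
x_i, y_i = 3.0, 7.1          # one observed (x, y) pair
y_hat = b0 + b1 * x_i        # predicted value from the line
residual = y_i - y_hat       # observed minus predicted
```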

Why do we use simple linear regression?
 understand cause and effect relationships
 make decisions on cost
 to predict outcomes

case or data point
one observed pair of explanatory (x) and response (y) variables

Y_{i} = β_{0} + β_{1}X_{i} + ε_{i}, for i = 1, ..., n
 Simple linear regression model
 β_{0} = intercept
 β_{1} = slope
 ε_{i} = independent, normally distributed random errors with mean 0 and variance σ^{2}

ε_{i} ~^{iid} N(0, σ^{2})
iid = independent and identically distributed
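One way to internalize the model is to simulate data from it. A minimal sketch, with made-up parameter values (beta0 = 1, beta1 = 2, sigma = 0.5):

```python
import random

# Simulate Y_i = beta0 + beta1 * X_i + eps_i, eps_i ~ iid N(0, sigma^2)
random.seed(0)                       # fixed seed for reproducibility
beta0, beta1, sigma = 1.0, 2.0, 0.5  # assumed parameter values
xs = [float(i) for i in range(1, 11)]
ys = [beta0 + beta1 * x + random.gauss(0.0, sigma) for x in xs]
```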

how to find estimates for beta 1 and beta 0
 graphically, by fitting a line to the scatterplot, or
 Analytical procedure (least squares):
 b_{1} = sum(X_{i} − mean X)(Y_{i} − mean Y) / sum(X_{i} − mean X)^{2}
 b_{0} = mean Y − b_{1} times mean X
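The analytical procedure above translates directly to code. A minimal sketch with made-up points that lie exactly on y = 1 + 2x, so the expected answers are known:

```python
# Made-up data lying exactly on y = 1 + 2x
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

# b1 = sum((X_i - mean X)(Y_i - mean Y)) / sum((X_i - mean X)^2)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)
# b0 = mean Y - b1 * mean X
b0 = ybar - b1 * xbar
```

For this exact-line data the estimates recover the true slope 2 and intercept 1.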

Maximum Likelihood
 Another way to derive the estimators b_{1} and b_{0}; with normal errors, the maximum likelihood estimates equal the least-squares estimates in simple linear regression

to calc b_{1} by hand with points
 b_{1} = sum(k_{i} times Y_{i})
 k_{i} = (X_{i} − mean X) / sum(X_{i} − mean X)^{2}
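The k_{i} weights make b_{1} a weighted sum of the Y values. A sketch using the same kind of made-up exact-line data as before (y = 1 + 2x), so the result should match the direct formula:

```python
# Made-up data lying exactly on y = 1 + 2x
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]
xbar = sum(x) / len(x)
sxx = sum((xi - xbar) ** 2 for xi in x)

# k_i = (X_i - mean X) / sum((X_j - mean X)^2)
k = [(xi - xbar) / sxx for xi in x]

# b1 as a weighted sum of the Y_i
b1 = sum(ki * yi for ki, yi in zip(k, y))
```

Note the weights depend only on the x values, which is what makes this form useful for deriving the sampling distribution of b_{1}.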

