**Regression** is a procedure for predicting the value of
one metric (continuous) variable based on the value of another.

A **regression equation** involves a predictor variable (independent
variable) and a criterion variable (dependent variable).

A **regression line** can be drawn which minimizes the sum of squared
deviations of the data points from the line (the least squares criterion).

This line has a **slope** ("b") and a point where
it crosses the y-axis (the y **intercept**, "a").

Regression equation:

Y = bX + a (+ error)

- Y = predicted score on the criterion variable
- X = score on the predictor variable
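The least squares fit above can be sketched in a few lines of plain Python; the function names here are illustrative, not from any particular library:

```python
# Simple linear regression: estimate slope (b) and intercept (a)
# by the least squares criterion.

def fit_line(xs, ys):
    """Return (b, a) minimizing the sum of squared deviations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # b = covariance(X, Y) / variance(X)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b = sxy / sxx
    a = mean_y - b * mean_x  # the line passes through (mean_x, mean_y)
    return b, a

def predict(b, a, x):
    """Regression equation: Y = bX + a."""
    return b * x + a

# Example: points that lie exactly on Y = 2X + 1
b, a = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(b, a)  # 2.0 1.0
```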

**Multiple regression** uses **multiple** independent
(predictor) variables to predict one dependent (criterion) variable

It yields a multiple correlation coefficient (**multiple R**)

- Multiple R is **highest** (the prediction is better) when the predictor variables have low correlations with each other but high correlations with the criterion.
- If predictors are highly correlated with each other, combining them yields little new information. Predictor overlap is called **multicollinearity**.
- Multiple R is **never lower** than the highest simple correlation between an individual predictor and the criterion.
- Multiple R can be **squared** to indicate the proportion of variance accounted for by all the predictor variables together. R-squared = coefficient of determination.
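These properties can be illustrated with a small sketch in pure Python, assuming a toy dataset where Y is an exact function of two predictors (so multiple R comes out at 1.0, at least as high as either simple correlation):

```python
# Multiple regression with two predictors: solve the normal equations
# (X'X) beta = X'Y, then compare multiple R with the simple correlations.
# Data and names are illustrative.

def solve(A, b):
    """Gauss-Jordan elimination for a small linear system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Two predictors, each related to Y; Y = X1 + X2 exactly.
x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]
y = [a + b for a, b in zip(x1, x2)]

# Design matrix columns: intercept, X1, X2
X = [[1.0, a, b] for a, b in zip(x1, x2)]
XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
beta = solve(XtX, Xty)

yhat = [beta[0] + beta[1] * a + beta[2] * b for a, b in zip(x1, x2)]
# Multiple R = correlation between predicted and observed criterion scores.
multiple_R = corr(yhat, y)
```

Here `multiple_R` is (up to floating-point error) 1.0, and `multiple_R ** 2` gives the coefficient of determination, the proportion of criterion variance the predictors account for together.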

An F value can be calculated to test the significance of the regression:

F = (SS regression / df regression) / (SS residual / df residual) = MS regression / MS residual
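A minimal sketch of that F ratio for the one-predictor case, assuming made-up illustrative data:

```python
# F ratio from the ANOVA partition of a simple regression:
# F = (SS_regression / df_regression) / (SS_residual / df_residual).

def regression_F(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    a = my - b * mx
    yhat = [b * x + a for x in xs]
    ss_reg = sum((yh - my) ** 2 for yh in yhat)              # explained
    ss_res = sum((y - yh) ** 2 for y, yh in zip(ys, yhat))   # unexplained
    df_reg, df_res = 1, n - 2  # one predictor
    return (ss_reg / df_reg) / (ss_res / df_res)

# A strong linear trend with a little noise gives a large F.
print(regression_F([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1]))
```

A large F (relative to the critical value for its degrees of freedom) indicates the regression explains significantly more variance than would be expected by chance.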