What Is Multiple Regression Analysis?
Multiple linear regression examines how two or more independent variables collectively predict a continuous dependent variable, and quantifies each predictor's unique contribution. Example: How well do age, education level, and work experience predict job performance?
Regression Assumptions
- Linearity: The relationship between predictors and outcome must be linear. Check with scatterplots.
- Normality of residuals: Residuals should be approximately normally distributed. Use Normal P-P Plot.
- Homoscedasticity: Residual variance should be constant across fitted values. Check the residuals vs. predicted scatterplot.
- No multicollinearity: Predictors should not be highly correlated with each other. Rules of thumb: VIF < 10 and Tolerance > 0.10.
- Independence: Residuals must be independent of one another. A Durbin-Watson statistic between 1.5 and 2.5 is generally acceptable.
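SPSS reports collinearity diagnostics and the Durbin-Watson statistic for you; as an illustration of what those numbers mean, here is a minimal NumPy sketch that computes VIF and Durbin-Watson by hand on synthetic (entirely made-up) data:

```python
import numpy as np

# Hypothetical synthetic data standing in for three predictors and an
# outcome (not real study data).
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                       # three predictors
y = X @ np.array([0.5, 0.3, 0.4]) + rng.normal(size=n)

def r_squared(X, y):
    """R-squared from an OLS fit of y on X (intercept included)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def vif(X):
    """Variance inflation factor: regress each predictor on the others."""
    return np.array([1.0 / (1.0 - r_squared(np.delete(X, j, axis=1), X[:, j]))
                     for j in range(X.shape[1])])

def durbin_watson(resid):
    """Durbin-Watson statistic; values near 2 suggest independent residuals."""
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ beta
print("VIF:", vif(X).round(2))            # near 1 -> little multicollinearity
print("Durbin-Watson:", round(durbin_watson(resid), 2))
```

With uncorrelated synthetic predictors the VIFs sit near 1 and Durbin-Watson near 2, which is what "assumptions met" looks like numerically.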
Running the Analysis in SPSS
Go to Analyze → Regression → Linear.
- Move the dependent variable to Dependent.
- Move predictors to Independent(s).
- Method: Enter (forced entry) or Stepwise.
- Statistics: R squared change, Descriptives, Collinearity diagnostics.
- Plots: ZRESID vs ZPRED and Normal P-P Plot for assumption checking.
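If you want to replicate the forced-entry (Enter) fit outside SPSS, the model is ordinary least squares with all predictors included at once. A minimal NumPy sketch on hypothetical data mimicking the running example (age, education, experience predicting job performance; all values are made up):

```python
import numpy as np

# Hypothetical data mimicking the example; values are simulated, not real.
rng = np.random.default_rng(42)
n = 200
age = rng.normal(35, 8, n)
edu = rng.normal(14, 2, n)
exper = rng.normal(10, 4, n)
perf = 0.02 * age + 0.15 * edu + 0.10 * exper + rng.normal(0, 1, n)

# "Enter" method: all predictors forced into the model simultaneously.
X = np.column_stack([np.ones(n), age, edu, exper])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
print(dict(zip(["intercept", "age", "education", "experience"],
               beta.round(3))))
```

The resulting coefficients correspond to the B column of the SPSS Coefficients table.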
Reading the Output
- Model Summary: R² shows the proportion of variance in the dependent variable explained by the predictors. Adjusted R² is preferred when comparing models with different numbers of predictors.
- ANOVA table: Tests whether the model as a whole is statistically significant; a p value below .05 indicates a significant model.
- Coefficients table: The standardized beta (β) shows each predictor's relative contribution; its p-value indicates whether that contribution is statistically significant.
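To make the output less of a black box, these quantities follow directly from the sums of squares of an OLS fit. A sketch on synthetic (hypothetical) data with n = 200 and k = 3 predictors, showing how R², adjusted R², the F statistic, and standardized betas are computed:

```python
import numpy as np

# Hypothetical data (n = 200, k = 3 predictors); values are simulated.
rng = np.random.default_rng(1)
n, k = 200, 3
X = rng.normal(size=(n, k))
y = X @ np.array([0.5, 0.3, 0.4]) + rng.normal(size=n)

Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ beta

ss_res = resid @ resid
ss_tot = (y - y.mean()) @ (y - y.mean())
r2 = 1 - ss_res / ss_tot                          # Model Summary: R²
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)     # Adjusted R²
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))      # ANOVA: F(k, n - k - 1)

# Standardized beta: unstandardized slope × SD(x) / SD(y)
std_beta = beta[1:] * X.std(axis=0) / y.std()
print(f"R² = {r2:.3f}, adj. R² = {adj_r2:.3f}, F({k}, {n-k-1}) = {f_stat:.2f}")
print("standardized β:", std_beta.round(3))
```

Note that adjusted R² is always at most R², since it penalizes the model for each added predictor.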
APA Reporting Example
Multiple regression analysis revealed that the model was statistically significant, F(3, 196)=28.74, p<.001, accounting for 30% of the variance in job performance (R²=.30, adjusted R²=.29). Work experience (β=.42, p<.001) and education level (β=.28, p=.003) emerged as significant predictors.
