
Binary Logistic Regression in SPSS: Step-by-Step Guide

When to Use Logistic Regression

When your dependent variable is binary (yes/no, diseased/healthy, pass/fail), linear regression is inappropriate: it can predict probabilities below 0 or above 1, and its assumptions of normally distributed, homoscedastic errors are violated. Binary logistic regression instead models the probability of one outcome category as a function of one or more predictor variables. It is widely used in the health sciences, social research, and education.
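The model behind this is the logistic (sigmoid) function applied to a linear combination of the predictors. A minimal sketch in Python, with made-up coefficients chosen only for illustration:

```python
import math

def predicted_probability(b0, coefs, xs):
    """Logistic model: P(Y=1) = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2 + ...))).

    b0    -- intercept (constant) from the fitted model
    coefs -- list of B coefficients, one per predictor
    xs    -- the predictor values for one case
    """
    linear = b0 + sum(b * x for b, x in zip(coefs, xs))
    return 1.0 / (1.0 + math.exp(-linear))

# Hypothetical coefficients: intercept -4.0, slope 0.077 for age
# (exp(0.077) ≈ 1.08, i.e. an odds ratio of about 1.08 per year)
p = predicted_probability(-4.0, [0.077], [50])
print(round(p, 3))  # → 0.463
```

Because the sigmoid squeezes any linear score into (0, 1), the predicted value is always a valid probability, which is exactly what the binary outcome requires.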

Advantages of Logistic Regression

  - Makes no assumption of normally distributed or homoscedastic errors.
  - Accepts both continuous and categorical predictors.
  - Coefficients convert directly into interpretable odds ratios (Exp(B)).
  - Requires linearity only between the predictors and the logit, not the outcome itself.

Running Logistic Regression in SPSS

Go to Analyze → Regression → Binary Logistic.

  1. Move the binary outcome variable to Dependent.
  2. Move predictors to Covariates. For categorical predictors, click Categorical to define dummy coding.
  3. Method: Enter (all predictors simultaneously) or Forward LR (stepwise).
  4. Options: Check Hosmer-Lemeshow goodness-of-fit and CI for exp(B).

Interpreting the Output

Omnibus Tests: Tests whether the model significantly improves prediction over the null model (p<0.05 needed).

Nagelkerke R²: Pseudo R-squared indicating model fit; 0.20=weak, 0.40=moderate, 0.60=strong.

Hosmer-Lemeshow test: p>0.05 indicates adequate model fit.

Variables in the Equation: B coefficient, Wald statistic, p-value, and crucially Exp(B) = Odds Ratio.
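The Wald statistic and Exp(B) in the Variables in the Equation table are simple transformations of the B coefficient and its standard error. A quick sketch (the B and S.E. values below are illustrative, not from real SPSS output):

```python
import math

def wald_and_or(b, se):
    """Recover the two derived columns SPSS reports for each coefficient:
    Wald chi-square = (B / S.E.)^2 (df = 1 for a single coefficient),
    Exp(B)          = e^B, the odds ratio."""
    wald = (b / se) ** 2
    odds_ratio = math.exp(b)
    return wald, odds_ratio

# Illustrative coefficient: B = 0.077, S.E. = 0.026
wald, odds_ratio = wald_and_or(0.077, 0.026)
# wald ≈ 8.77 (significant at p < .01), odds_ratio ≈ 1.08
```

A large Wald value relative to the chi-square distribution with 1 df is what drives the small p-value in that row.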

Interpreting the Odds Ratio

OR = 1.0 → no effect.
OR > 1.0 → the predictor increases the odds of the outcome.
OR < 1.0 → the predictor decreases the odds.

If the 95% CI for OR does not include 1.0, the result is statistically significant.
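The CI rule can be checked directly from B and its standard error, since the 95% CI for the odds ratio is exp(B ± 1.96 × S.E.). A sketch, using hypothetical B/S.E. values chosen to land close to the age effect in the reporting example below:

```python
import math

def or_with_ci(b, se, z=1.96):
    """Odds ratio with its 95% CI, plus a significance flag.

    The CI is computed on the log-odds scale (B ± z*S.E.) and then
    exponentiated. The effect is significant at alpha = .05 exactly
    when the resulting interval excludes 1.0."""
    lo = math.exp(b - z * se)
    hi = math.exp(b + z * se)
    significant = not (lo <= 1.0 <= hi)
    return math.exp(b), lo, hi, significant

# Hypothetical: B = 0.077, S.E. = 0.026
odds_ratio, lo, hi, significant = or_with_ci(0.077, 0.026)
# odds_ratio ≈ 1.08, CI ≈ [1.03, 1.14], significant → True
```

Note that the interval is built on the log scale and exponentiated, which is why SPSS's CI for Exp(B) is asymmetric around the odds ratio.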

APA Reporting Example

Binary logistic regression identified age (OR = 1.08, 95% CI [1.03, 1.14], p = .003) and education level (OR = 2.34, 95% CI [1.42, 3.86], p = .001) as significant predictors of disease risk. The model demonstrated adequate fit (Hosmer-Lemeshow χ²(8) = 6.21, p = .62) and explained approximately 34% of the variation in the outcome (Nagelkerke R² = .34).
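When transcribing several predictors from SPSS output into a manuscript, a small formatting helper can reduce copy-paste errors. This is a sketch of one possible helper (the function name and formatting choices are this example's own, not an SPSS feature):

```python
def apa_or(name, odds_ratio, ci_lo, ci_hi, p):
    """Format one predictor's odds ratio in APA style, e.g.
    'age (OR = 1.08, 95% CI [1.03, 1.14], p = .003)'.
    APA drops the leading zero for p-values and reports p < .001
    for very small values."""
    if p < 0.001:
        p_str = "p < .001"
    else:
        p_str = "p = " + format(p, ".3f").lstrip("0")
    return (f"{name} (OR = {odds_ratio:.2f}, "
            f"95% CI [{ci_lo:.2f}, {ci_hi:.2f}], {p_str})")

print(apa_or("age", 1.08, 1.03, 1.14, 0.003))
# → age (OR = 1.08, 95% CI [1.03, 1.14], p = .003)
```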
