Correlation and Regression - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Correlation and Regression


1
Correlation and Regression
  • It's the Last Lecture. Hooray!

2
Correlation
  • Analyze → Correlate → Bivariate
  • Click over the variables you wish to correlate
  • Options → Can select descriptives and pairwise
    vs. listwise deletion
  • Pairwise deletion - each correlation uses only the
    cases with data on both variables in that pair
    (the default)
  • Listwise deletion - only cases with data on all
    variables are included (see the sketch below)
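
A rough Python equivalent of this step (a sketch, not the slide's SPSS menus; it assumes pandas and scipy and a made-up DataFrame df with columns x, y, z):

    import pandas as pd
    from scipy import stats

    # Hypothetical data with some missing values (not from the slides)
    df = pd.DataFrame({
        "x": [1.0, 2.0, 3.0, None, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0],
        "y": [2.1, 1.9, 3.6, 4.2, 5.1, None, 7.3, 7.9, 9.2, 10.4],
        "z": [0.4, 2.6, 1.1, 3.8, None, 2.9, 5.2, 4.1, 6.7, 5.9],
    })

    # Pairwise deletion: each correlation uses every case with data on
    # that particular pair of variables (what pandas does by default)
    print(df.corr(method="pearson"))

    # Listwise deletion: drop any case missing data on ANY variable first
    print(df.dropna().corr(method="pearson"))

    # A single correlation with its p-value (complete cases for that pair)
    pair = df[["x", "y"]].dropna()
    r, p = stats.pearsonr(pair["x"], pair["y"])
    print(f"r = {r:.3f}, p = {p:.3f}")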

3
Correlation
  • Assumptions
  • Linear relationship between variables
  • Inspect scatterplot
  • Normality
  • Shapiro-Wilk W test
  • Other issues
  • Range restriction; heterogeneous subgroups
  • Identified methodologically
  • Outliers
  • Inspect scatterplot
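
A sketch of these checks in Python (assuming scipy and matplotlib, and reusing the hypothetical df from the sketch above):

    import matplotlib.pyplot as plt
    from scipy import stats

    clean = df[["x", "y"]].dropna()

    # Linearity and outliers: inspect the scatterplot
    plt.scatter(clean["x"], clean["y"])
    plt.xlabel("x"); plt.ylabel("y")
    plt.show()

    # Normality: Shapiro-Wilk W test for each variable
    for col in ("x", "y"):
        w, p = stats.shapiro(clean[col])
        print(f"{col}: W = {w:.3f}, p = {p:.3f}")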

4
(No Transcript)
5
Correlation
  • Partial Correlation removes the variance
    associated with a 3rd variable, like ANCOVA
  • Analyze → Correlate → Partial
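
A minimal sketch of the same idea in Python, residualizing x and y on a control variable z and correlating the residuals (scipy is assumed; the column names are hypothetical, as above):

    from scipy import stats

    clean = df[["x", "y", "z"]].dropna()

    def residualize(target, control):
        # Residuals from a simple regression of target on control
        slope, intercept, *_ = stats.linregress(control, target)
        return target - (intercept + slope * control)

    rx = residualize(clean["x"].to_numpy(), clean["z"].to_numpy())
    ry = residualize(clean["y"].to_numpy(), clean["z"].to_numpy())

    # Correlation of x and y with the variance due to z removed
    # (p-value is approximate; the exact test uses n - 3 degrees of freedom)
    r_partial, p = stats.pearsonr(rx, ry)
    print(f"partial r(x, y | z) = {r_partial:.3f}, p = {p:.3f}")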

6
(No Transcript)
7
Regression
  • Analyze → Regression → Linear
  • Use if both predictor(s) and criterion variables
    are continuous
  • Dependent = Criterion
  • Independent = Predictor(s)
  • Statistics
  • Regression Coefficients (b, β)
  • Estimates
  • Confidence intervals
  • Covariance matrix
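
A rough statsmodels version of this setup (an assumption on my part, not the slide's SPSS steps; it continues the hypothetical df from the correlation sketches, with y as the criterion and x and z as predictors):

    import statsmodels.api as sm

    clean = df[["x", "y", "z"]].dropna()
    X = sm.add_constant(clean[["x", "z"]])  # predictors (independent variables)
    model = sm.OLS(clean["y"], X).fit()     # criterion (dependent variable)

    print(model.params)       # unstandardized coefficients (b)
    print(model.conf_int())   # 95% confidence intervals
    print(model.summary())    # full output, including model fit (R squared)
    # Standardized coefficients (β) would require z-scoring the variables first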

8
Regression
  • Statistics
  • Model fit
  • R square change
  • Descriptives
  • Part and partial correlations
  • Collinearity diagnostics
  • Recall that you don't want your predictors to be
    too highly related to one another
  • Collinearity/Multicollinearity - when predictors
    are too highly correlated with one another
  • Eigenvalues of the scaled and uncentered
    cross-products matrix, condition indices, and
    variance-decomposition proportions are displayed
    along with variance inflation factors (VIF) and
    tolerances for individual predictors
  • Tolerances should be > .2; VIF should be < 4 (see
    the sketch below)
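
A sketch of the tolerance/VIF check with statsmodels, continuing the OLS sketch above (the > .2 and < 4 cutoffs are the ones from this slide):

    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    X = sm.add_constant(clean[["x", "z"]])
    for i, name in enumerate(X.columns):
        if name == "const":
            continue                       # skip the intercept column
        vif = variance_inflation_factor(X.values, i)
        tol = 1.0 / vif
        flag = "" if tol > 0.2 and vif < 4 else "  <- possible collinearity"
        print(f"{name}: VIF = {vif:.2f}, tolerance = {tol:.2f}{flag}")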

9
Regression
  • Statistics
  • Residuals
  • Durbin-Watson
  • Tests for correlation among residuals (i.e.,
    autocorrelation); significant correlation
    implies nonindependent data
  • Clicking on this will also display a histogram of
    residuals, a normal probability plot of
    residuals, and the case numbers and standardized
    residuals for the 10 cases with the largest
    standardized residuals
  • Casewise diagnostics
  • Identifies outliers according to pre-specified
    criteria
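
A sketch of these residual diagnostics with statsmodels, continuing the fitted model from the earlier sketch:

    import pandas as pd
    from statsmodels.stats.stattools import durbin_watson

    # Durbin-Watson: values near 2 suggest no autocorrelation among residuals
    print("Durbin-Watson:", durbin_watson(model.resid))

    # Casewise diagnostics: cases with the largest standardized residuals
    influence = model.get_influence()
    std_resid = pd.Series(influence.resid_studentized_internal,
                          index=clean.index)
    print(std_resid.abs().sort_values(ascending=False).head(10))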

10
Regression
  • Plots
  • Plot standardized residuals (ZRESID) on y-axis
    and standardized predicted values (ZPRED) on
    x-axis
  • Check Normal probability plot under
    Standardized Residual Plots
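
A matplotlib sketch of the same two plots (ZRESID vs. ZPRED, plus a normal probability plot), again continuing the fitted model above:

    import matplotlib.pyplot as plt
    from scipy import stats

    # Standardized residuals vs. standardized predicted values
    fitted = model.fittedvalues
    zresid = (model.resid - model.resid.mean()) / model.resid.std()
    zpred = (fitted - fitted.mean()) / fitted.std()
    plt.scatter(zpred, zresid)
    plt.axhline(0, linestyle="--")
    plt.xlabel("ZPRED"); plt.ylabel("ZRESID")
    plt.show()

    # Normal probability plot of the residuals
    stats.probplot(model.resid, plot=plt)
    plt.show()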

11
Regression
  • Assumptions
  • Observations are independent
  • Linearity of Regression
  • Look for residuals that get larger at extreme
    values; also check that the residuals are
    normally distributed
  • Save unstandardized residuals
  • Click Save → Under Residuals, click
    Unstandardized when you run your regression
  • Run a Shapiro-Wilk W test on this variable
    (RES_1), as in the sketch below
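
In the statsmodels sketch above, the saved residuals are simply model.resid, so the check looks like this:

    from scipy import stats

    # Equivalent of running Shapiro-Wilk on SPSS's saved RES_1 variable
    w, p = stats.shapiro(model.resid)
    print(f"Shapiro-Wilk on residuals: W = {w:.3f}, p = {p:.3f}")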

12
Regression
  • Normality in Arrays
  • Examine normal probability plot of the residuals,
    residuals should resemble normal distribution
    curve
  • (Slide shows two example normal probability
    plots, labeled BAD and GOOD)

13
Regression
  • Homogeneity of Variance in Arrays
  • Look for residuals getting more spread out as a
    function of predicted value, i.e. a cone-shaped
    pattern in the plot of standardized residuals vs.
    standardized predicted values
  • (Slide shows two example residual plots, labeled
    BAD and GOOD)

14
Regression Output
15
Regression Output
16
Regression Output
17
Logistic Regression
  • Analyze → Regression → Binary Logistic
  • Use if the criterion is dichotomous; no
    assumptions about the predictor(s)
  • Use Multinomial Logistic if the criterion is
    polychotomous (3 or more groups)
  • Don't worry about that, though
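
A hedged statsmodels sketch of a binary logistic regression (the data here are made up; group is a hypothetical 0/1 criterion, x and z are continuous predictors):

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical data, not from the slides
    rng = np.random.default_rng(0)
    d = pd.DataFrame({"x": rng.normal(size=200), "z": rng.normal(size=200)})
    signal = 0.8 * d["x"] - 0.5 * d["z"] + rng.normal(size=200)
    d["group"] = (signal > 0).astype(int)

    X = sm.add_constant(d[["x", "z"]])
    logit = sm.Logit(d["group"], X).fit()
    print(logit.summary())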

18
Logistic Regression
  • Assumptions
  • Observations are independent
  • Criterion is dichotomous
  • No stats needed to show either one of these
  • Important issues
  • Outliers
  • Save → Influence → Check Cook's and Leverage
    values
  • Cook's statistic - outlier = any value >
    4/(n-k-1), where n = # of cases, k = # of
    predictors (see the sketch below)
  • Leverage values - outlier = anything > .5
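
A sketch of this screen in Python; it assumes a recent statsmodels that exposes get_influence() on GLM results, and it refits the logistic model above as a binomial GLM to obtain those measures:

    import statsmodels.api as sm

    glm = sm.GLM(d["group"], X, family=sm.families.Binomial()).fit()
    infl = glm.get_influence()

    n = X.shape[0]        # number of cases
    k = X.shape[1] - 1    # number of predictors (excluding the constant)
    cooks = infl.cooks_distance[0]
    leverage = infl.hat_matrix_diag

    print("Cook's D outliers:", (cooks > 4 / (n - k - 1)).sum())
    print("High-leverage cases (> .5):", (leverage > 0.5).sum())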

19
Logistic Regression
  • Multicollinearity
  • Tolerance and/or VIF statistics aren't easily
    obtained with SPSS, so you'll just have to let
    this one go
  • Options
  • Classification plots
  • Table of the actual # of Ss in each criterion
    group vs. predicted group membership; shows, in
    detail, how well the regression predicted the
    data (see the sketch below)
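
A sketch of that classification table with pandas, using a .5 cutoff on the predicted probabilities from the logistic sketch above:

    import pandas as pd

    predicted = (logit.predict(X) >= 0.5).astype(int)
    print(pd.crosstab(d["group"], predicted,
                      rownames=["actual"], colnames=["predicted"]))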

20
Logistic Regression
  • Options
  • Hosmer-Lemeshow goodness-of-fit
  • More robust than the traditional χ2 goodness-of-fit
    statistic, particularly for models with
    continuous covariates and small sample sizes
  • Casewise listing of residuals
  • Helps ID cases with large residuals (outliers)
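
SPSS computes Hosmer-Lemeshow for you; as a sketch of what the test does, here is a rough hand-rolled version (bins of predicted risk, observed vs. expected events, chi-square on the result), continuing the logistic sketch above:

    import pandas as pd
    from scipy import stats

    def hosmer_lemeshow(y, p, groups=10):
        # Bin cases into groups of predicted risk, then compare observed vs.
        # expected event counts with a chi-square statistic (df = groups - 2)
        data = pd.DataFrame({"y": y, "p": p})
        data["bin"] = pd.qcut(data["p"], groups, duplicates="drop")
        obs = data.groupby("bin", observed=True)["y"].agg(["sum", "count"])
        exp = data.groupby("bin", observed=True)["p"].sum()
        chi2 = (((obs["sum"] - exp) ** 2)
                / (exp * (1 - exp / obs["count"]))).sum()
        return chi2, stats.chi2.sf(chi2, len(obs) - 2)

    hl_chi2, hl_p = hosmer_lemeshow(d["group"], logit.predict(X))
    print(f"Hosmer-Lemeshow chi-square = {hl_chi2:.2f}, p = {hl_p:.3f}")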

21
Logistic Regression
  • Options
  • Correlations of estimates
  • Just what it sounds like - correlations among the
    parameter estimates (the predictors' coefficients)
  • Iteration history
  • CI for exp(B)
  • Provides confidence intervals for exp(B), the
    odds ratio associated with each predictor
  • Categorical
  • If any predictors are discrete, they must be
    identified here, as well as which group is the
    reference group (identified as 0 vs. 1)
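
A sketch of the exp(B) confidence intervals (exponentiating the coefficients and their CIs) and of dummy-coding a categorical predictor, continuing the logistic sketch above; the condition column is hypothetical:

    import numpy as np
    import pandas as pd

    # Odds ratios (exp(B)) with 95% confidence intervals
    or_table = np.exp(pd.concat([logit.params, logit.conf_int()], axis=1))
    or_table.columns = ["exp(B)", "2.5%", "97.5%"]
    print(or_table)

    # A discrete predictor would be dummy-coded with a reference group, e.g.:
    # d = d.join(pd.get_dummies(d["condition"], prefix="cond", drop_first=True))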

22
Logistic Regression Output
23
Logistic Regression Output
24
Logistic Regression Output
25
Logistic Regression Output
26
Logistic Regression Output
27
Logistic Regression Output
  • Step number 1
  • Observed Groups and Predicted Probabilities
    (SPSS classification plot: a text histogram with
    frequency on the y-axis and predicted probability
    on the x-axis, plotting each case with the symbol
    of its observed group)