I think this question has already been asked here, but I can't fully understand the answer.
I have a number of ordinal predictors that I'm transforming into dummy variables. For hierarchical multiple regression, does the linearity assumption (a linear relationship between each predictor and the outcome variable, and between the composite and the outcome) need to be met for each dummy variable?
There's nothing to check!
Linearity is automatically met for binary (/dummy) variables.
However you set them up, the IV (x) takes only two values (say 0 and 1, though any two distinct values work; the particular coding makes no substantive difference). If the DV (y) has a different mean at those two values, the coefficient measures that difference, and that difference is exactly what enters the model linearly: the slope on a 0/1 variable is the mean difference.
If the two coded values differ by something other than 1, the slope changes accordingly, but the fitted mean change is still the coefficient times the change in the dummy; that is, the linear model always recovers exactly the mean difference.
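A minimal sketch of this point, using simulated data (the group means and sample sizes here are arbitrary assumptions for illustration): the OLS slope on a 0/1 dummy equals the difference in group means, and recoding the dummy as 0/2 halves the slope while leaving the fitted mean difference unchanged.

```python
import numpy as np

# Simulated outcome: two groups with different means (values chosen arbitrarily)
rng = np.random.default_rng(0)
x = np.repeat([0.0, 1.0], 50)                  # 0/1 dummy
y = 2.0 + 1.5 * x + rng.normal(0, 1, size=100) # true mean difference = 1.5

# Slope from simple linear regression of y on the dummy
slope, intercept = np.polyfit(x, y, 1)

# Difference in group means, computed directly
mean_diff = y[x == 1].mean() - y[x == 0].mean()
print(slope, mean_diff)  # these agree up to floating-point error

# Recode the dummy as 0/2: the slope halves, but slope * (change in dummy)
# still reproduces the same mean difference
slope2, _ = np.polyfit(2 * x, y, 1)
print(slope2 * 2, mean_diff)  # again equal up to floating-point error
```

Because the line is fit through only two x-values, it passes exactly through the two group means, so there is no curvature for a linearity check to detect.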