It seems to be common to take a "high" condition number as a sign of multicollinearity in regression analysis. For linear models I'm totally convinced that this is a good idea, but is there any analysis of how collinearity influences the condition number of logistic regression models?

I can't find any, and simple simulations seem to indicate that there is no monotonically increasing relationship between the two.
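To show the kind of simulation I mean, here is a minimal sketch (the correlation structure and coefficients are made up for illustration): it generates two predictors with correlation `rho` and compares the condition number of the design matrix with that of the Fisher information matrix $X^\top W X$ of a logistic model, where $W = \mathrm{diag}(p_i(1-p_i))$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
beta = np.array([0.0, 1.0, 1.0])  # hypothetical "true" coefficients

conds = {}  # rho -> (cond of X, cond of Fisher information)
for rho in [0.0, 0.5, 0.9, 0.99]:
    # Pair of predictors with (population) correlation rho.
    x1 = rng.normal(size=n)
    x2 = rho * x1 + np.sqrt(1.0 - rho**2) * rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])

    # Fisher information of the logistic model at beta: X' W X,
    # where W = diag(p_i * (1 - p_i)).
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    info = X.T @ ((p * (1.0 - p))[:, None] * X)

    conds[rho] = (np.linalg.cond(X), np.linalg.cond(info))

for rho, (cx, ci) in conds.items():
    print(f"rho={rho:.2f}  cond(X)={cx:10.1f}  cond(X'WX)={ci:10.1f}")
```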

It would be great if anyone could point out a reference or give a short analysis of this!


#### Best Answer

This UCLA page covers this issue in the context of logistic regression. But the phrasing of this question suggests some misunderstanding of multicollinearity and condition numbers.

The condition number and multicollinearity are functions of the design matrix of independent variable values. They bear no relation to the dependent variable or to the type of regression. A high condition number or multicollinearity means that some of the predictor variables are close to being linear combinations of each other. Thus in *any* linear modeling there will be ambiguity in determining which is the "true" predictor variable among a set of collinear variables. It doesn't matter whether the regression is linear, logistic, or any other type of generalized linear model.
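To make this concrete, here is a minimal sketch (with made-up predictors) showing that the condition number is computed from the design matrix alone: no response vector, and hence no choice of linear versus logistic regression, enters the calculation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Independent predictors: low condition number.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X_indep = np.column_stack([np.ones(n), x1, x2])

# Near-collinear predictors: x3 is almost a linear function of x1.
x3 = x1 + 1e-3 * rng.normal(size=n)
X_collin = np.column_stack([np.ones(n), x1, x3])

print("independent:   ", np.linalg.cond(X_indep))
print("near-collinear:", np.linalg.cond(X_collin))

# Note that no y (continuous or binary) appears anywhere above:
# the condition number is a property of X only.
```

Whether the same columns are later paired with a continuous response (linear regression) or a binary one (logistic regression), the reported condition number is identical.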
