I've run an MLM using lmer in R on some data that I have. As I'm analysing binary data, I've used a binomial MLM. I basically have a simple question about how to report the coefficients from the model.
I've read that the best thing to do with the binomial coefficients is to back-transform them using the invlogit function in order to make them more human-readable. The issue with this, and this probably seems like a stupid question but I want to make sure, is how to handle the sign of the coefficient when using the invlogit function.
So, for example, if I have a coefficient of -2, I'm not sure whether to include the sign when transforming. That is:

invlogit(2)

gives an answer of:

0.8807971

However, if we include the sign:

invlogit(-2)

gives an answer of:

0.1192029
So, with that in mind, should I ignore the sign of the coefficient when using invlogit and then re-apply the sign afterwards, or should I keep it in when doing invlogit? I assume the answer is the former option, given that the pattern of my data makes more sense that way, but since this is the first time I've done MLMs of this type with binomial data, I'd like to make 100% certain before submitting a paper that's based on this output!
First of all, the coefficients of the model (apart from the intercept) are differences between logits, not logits themselves.
If you want to express these coefficients as differences in probabilities, the procedure depends on the structure of your model. I recommend producing a table of mean values (on the logit scale) from the coefficients of the model (this also includes the intercept). Then, choose the conditions you want to compare and calculate the inverse of the corresponding logits. Finally, calculate the difference between these values. The result is the difference between the two conditions measured on the probability scale.
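As a sketch of these steps in R, assuming a hypothetical model with an intercept and a single dummy-coded predictor (the coefficient values here are made up for illustration):

```r
# Hypothetical coefficients from a binomial MLM:
# intercept = -1 (logit of the reference condition)
# beta = -2 (difference in logits for the other condition)
intercept <- -1
beta <- -2

# Step 1: mean values on the logit scale for each condition
logit_ref   <- intercept          # reference condition
logit_other <- intercept + beta   # other condition

# Step 2: back-transform each condition's logit to a probability
# (plogis() is base R's inverse logit, equivalent to arm::invlogit)
p_ref   <- plogis(logit_ref)      # 0.2689414
p_other <- plogis(logit_other)    # 0.04742587

# Step 3: difference between the conditions on the probability scale
p_other - p_ref                   # -0.2215156
```

Note that the difference is computed *after* back-transforming each condition, not by back-transforming the coefficient itself: because the inverse logit is nonlinear, plogis(beta) is not the change in probability.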
On transforming logits into probabilities:
You have to keep the sign when calculating the inverse of the logit transformation.
The results of the inverse transformation are symmetric around a probability of $0.5$. If we transform a probability of $0.5$ to logits, we obtain a value of $0$. Lower probabilities are indicated by negative logits, higher probabilities are indicated by positive logits.
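This symmetry can be checked directly in R (plogis() is base R's inverse logit):

```r
# a logit of 0 maps to a probability of 0.5
plogis(0)                 # 0.5

# probabilities for +x and -x mirror each other around 0.5,
# i.e. plogis(x) + plogis(-x) == 1 for any x
plogis(2)                 # 0.8807971
plogis(-2)                # 0.1192029
plogis(2) + plogis(-2)    # 1
```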
Hence, the inverse transformation is sensitive to the sign of the logit: negative logits result in probabilities below $0.5$, positive logits in probabilities above $0.5$.
In your example, you can also see the symmetry of the logit values: the absolute difference between a probability of $0.5$ and the probabilities you obtained with the inverse transformations of $2$ and $-2$, respectively, is identical:
(0.5 - 0.8807971) # -0.3807971
(0.5 - 0.1192029) #  0.3807971
This curve illustrates the relation between probabilities and logits:
curve(binomial()$linkinv(x), from = -5, to = 5, xlab = "logit", ylab = "probability")