I am reading the second edition of Categorical Data Analysis by Alan Agresti, and I am stuck on the following paragraph:
I don't quite understand why $\beta\pi(\hat{x})(1 - \pi(\hat{x}))$ gives the probability when $x = 26.3$. Can anyone enlighten me? Thanks.
Best Answer
The answer is near the bottom of p. 166. It's using a linear approximation (what social scientists would call a 'marginal effect'). A small change $\delta x$ in $x$ gives a change in probability of: $$\delta\pi \approx \frac{\partial \pi(x)}{\partial x}\, \delta x.$$ With $\operatorname{logit}(\pi(x)) = \alpha + \beta x$, i.e. $\pi(x) = e^{\alpha + \beta x}/(1 + e^{\alpha + \beta x})$, differentiating and simplifying gives $\partial \pi(x)/\partial x = \beta \pi(x)(1 - \pi(x))$. So the expression in question is the instantaneous rate of change of the fitted probability, evaluated at $x = 26.3$.
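A quick numerical sketch of this identity: the snippet below compares the analytic derivative $\beta\pi(x)(1-\pi(x))$ with a finite-difference approximation of $\partial\pi/\partial x$ at $x = 26.3$. The values of $\alpha$ and $\beta$ here are made up for illustration, not taken from Agresti's example.

```python
import math

# Hypothetical logistic model logit(pi(x)) = alpha + beta * x
# (alpha and beta are illustrative values, not from the book)
alpha, beta = -3.0, 0.1
x = 26.3

def pi(x):
    """Inverse logit: fitted probability under the model."""
    return 1.0 / (1.0 + math.exp(-(alpha + beta * x)))

# Analytic derivative: d pi / dx = beta * pi(x) * (1 - pi(x))
analytic = beta * pi(x) * (1 - pi(x))

# Central finite-difference check of the same derivative
h = 1e-6
numeric = (pi(x + h) - pi(x - h)) / (2 * h)

print(analytic, numeric)  # the two should agree very closely
```

Multiplying `analytic` by a small $\delta x$ then gives the approximate change in probability the answer describes.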