What's a quick way to work out the derivative below (and similar problems)?
Taking this derivative by hand involves a lot of time and tedious calculation; there has to be a better way.
This is taken from the solutions to Chris Bishop's *Pattern Recognition and Machine Learning*; the derivative in question is of the quadratic form $(\mathbf{x} - \mathbf{A}\mathbf{s})^T\Sigma^{-1}(\mathbf{x} - \mathbf{A}\mathbf{s})$ with respect to the matrix $\mathbf{A}$.
Best Answer
There is something called the Matrix Cookbook, which collects a large number of matrix identities and matrix derivatives. If we look at its eq. (88),
$$\frac{\partial}{\partial \mathbf{A}} (\mathbf{x} - \mathbf{A}\mathbf{s})^T \mathbf{W} (\mathbf{x} - \mathbf{A}\mathbf{s}) = -2\mathbf{W}(\mathbf{x} - \mathbf{A}\mathbf{s})\mathbf{s}^T$$
we see that this applies directly to your problem with $\mathbf{W} = \Sigma^{-1}$: the identity requires $\mathbf{W}$ to be symmetric, which holds here because $\Sigma$ is a covariance matrix and so $\Sigma^{-1}$ is symmetric as well.
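If you want to convince yourself of an identity like this before using it, a quick numerical check is often faster than redoing the algebra. Here is a minimal NumPy sketch (the dimensions and variable names are my own choices, not from the Cookbook) that compares the analytic gradient from eq. (88) against central finite differences:

```python
import numpy as np

# Numerical check of Matrix Cookbook eq. (88):
#   d/dA (x - A s)^T W (x - A s) = -2 W (x - A s) s^T,  for symmetric W.
rng = np.random.default_rng(0)
n, m = 4, 3
x = rng.standard_normal(n)
s = rng.standard_normal(m)
A = rng.standard_normal((n, m))
W = rng.standard_normal((n, n))
W = W + W.T  # make W symmetric, as the identity requires

def f(A):
    """Scalar quadratic form (x - A s)^T W (x - A s)."""
    r = x - A @ s
    return r @ W @ r

# Analytic gradient from eq. (88)
analytic = -2 * np.outer(W @ (x - A @ s), s)

# Central finite differences, one entry of A at a time
eps = 1e-6
numeric = np.zeros_like(A)
for i in range(n):
    for j in range(m):
        E = np.zeros_like(A)
        E[i, j] = eps
        numeric[i, j] = (f(A + E) - f(A - E)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-6))  # True
```

Since $f$ is quadratic in $\mathbf{A}$, central differences are exact up to floating-point rounding here, so the check is quite stringent; for your problem you would substitute $\Sigma^{-1}$ for $\mathbf{W}$.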