Are there any software libraries that compute closed-form KL divergences and also give their derivatives with respect to the distributions' parameters? I'm using Julia, so it's particularly straightforward for me to call libraries written in Julia, Fortran, C, or C++.
Alternatively, if no such libraries exist, is there something I can do that would be easier than coding the KL divergences by hand, perhaps using automatic differentiation? I have to compute KL divergences for about 10 pairs of distributions with closed-form KL divergences, e.g. beta/beta, log-normal/log-normal, mv-normal/mv-normal, wrapped-Cauchy/uniform.
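To illustrate the automatic-differentiation route mentioned above, here is a minimal sketch (not from any of the libraries discussed, and the function names are my own) that codes one closed-form KL divergence directly and lets ForwardDiff.jl supply the parameter derivatives:

```julia
# Hedged sketch: closed-form KL between two univariate normals
# N(μ1, σ1²) and N(μ2, σ2²), with the gradient obtained by
# forward-mode automatic differentiation rather than by hand.
using ForwardDiff

# KL(N(μ1,σ1²) ‖ N(μ2,σ2²)) in closed form
kl_normal(μ1, σ1, μ2, σ2) =
    log(σ2 / σ1) + (σ1^2 + (μ1 - μ2)^2) / (2σ2^2) - 0.5

# Gradient with respect to all four parameters, packed as a vector
kl_normal_grad(p) = ForwardDiff.gradient(q -> kl_normal(q...), p)

kl_normal(0.0, 1.0, 0.0, 1.0)  # → 0.0 (identical distributions)
```

Because the closed-form expression is a short scalar function, forward-mode AD here costs only a small constant factor over the function evaluation itself, which may be easier than deriving and maintaining ten sets of gradients by hand.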
I ended up coding the KL divergences and their derivatives myself in Julia, and released them as part of an existing open-source project. Future readers may find the code at this page of the Celeste.jl project.
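To give a flavor of what hand-coding one of these pairs involves (this is an illustrative sketch in the same spirit, not the actual Celeste.jl code), here is the closed-form beta/beta KL divergence together with one hand-derived partial derivative:

```julia
# Hedged sketch: KL(Beta(α1,β1) ‖ Beta(α2,β2)) in closed form,
# plus its partial derivative in α1, derived by hand.
# digamma/trigamma come from SpecialFunctions.jl.
using SpecialFunctions: logbeta, digamma, trigamma

function kl_beta(α1, β1, α2, β2)
    logbeta(α2, β2) - logbeta(α1, β1) +
        (α1 - α2) * digamma(α1) +
        (β1 - β2) * digamma(β1) +
        (α2 - α1 + β2 - β1) * digamma(α1 + β1)
end

# ∂KL/∂α1: the logbeta and digamma product-rule terms partially cancel,
# leaving only trigamma (ψ′) terms.
dkl_beta_dα1(α1, β1, α2, β2) =
    (α1 - α2) * trigamma(α1) +
    (α2 - α1 + β2 - β1) * trigamma(α1 + β1)

kl_beta(2.0, 3.0, 2.0, 3.0)  # → 0.0 (identical distributions)
```

Each distribution pair needs this kind of derivation once; checking the hand-coded gradient against finite differences or an AD gradient is a cheap way to catch sign errors.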