The question of how to derive the KL divergence between two Gaussians comes up frequently. It arises especially often in the context of variational auto-encoders, because the essential paper on the topic, Kingma and Welling's "Auto-Encoding Variational Bayes", provides only the barest details about the result. Naturally, people ask how to obtain it.
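For reference, the result in question is the closed-form expression for the KL divergence between two univariate Gaussians,

$$D_{\mathrm{KL}}\left(\mathcal{N}(\mu_1, \sigma_1^2) \,\|\, \mathcal{N}(\mu_2, \sigma_2^2)\right) = \log\frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} - \frac{1}{2},$$

which, for a diagonal-Gaussian posterior $q(z|x) = \mathcal{N}(\mu, \sigma^2 I)$ and a standard-normal prior $p(z) = \mathcal{N}(0, I)$, reduces to the term given in Appendix B of Kingma and Welling:

$$D_{\mathrm{KL}}\left(q(z|x) \,\|\, p(z)\right) = -\frac{1}{2} \sum_{j=1}^{J} \left(1 + \log \sigma_j^2 - \mu_j^2 - \sigma_j^2\right).$$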
As a result, several versions of essentially the same question have been asked:
- VAE derivation for Gaussian case
- Solution of $\int p_\theta(z) \log q(z)\, dz$ of Gaussian case
- KL-Divergence of $Q(z|X)$ and $P(z)$ in Variational Autoencoder (VAE)
- Deriving the KL divergence loss for VAEs
- KL divergence between two univariate Gaussians
There are, I assume, many more examples. This isn't great, because a goal of the site is to produce [canonical answers](https://meta.stackoverflow.com/questions/291992/what-is-a-canonical-question-answer-and-what-is-their-purpose) to statistical questions.
- Do we have a canonical Q&A that we can use as a one-stop duplicate target?
- If not, what should a canonical Question address?
- Should we merge any of these Questions?
- Are there additional Questions that we should add to this list?
Note that marking a question as a duplicate is a simple action and easily reversed. Merging threads, on the other hand, is permanent and irreversible, so we are hesitant to take that action unless we're absolutely certain that the questions are exactly the same: all of the answers to one are answers to the other, and vice versa.