For questions related to the Jensen–Shannon (JS) divergence, a measure of the similarity between two probability distributions. The JS divergence is a symmetrized and smoothed version of the Kullback–Leibler (KL) divergence.
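For reference, the standard definition in terms of the KL divergence, where $M$ denotes the equal-weight mixture of the two distributions $P$ and $Q$:

$$\mathrm{JSD}(P \parallel Q) = \tfrac{1}{2}\, D_{\mathrm{KL}}(P \parallel M) + \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \parallel M), \qquad M = \tfrac{1}{2}(P + Q).$$

Unlike the KL divergence, it is symmetric in $P$ and $Q$ and bounded above by $\log 2$.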
Questions tagged [jensen-shannon-divergence]
2 questions
5 votes, 1 answer
Why is the Jensen-Shannon divergence preferred over the KL divergence in measuring the performance of a generative network?
I have read articles on how the Jensen-Shannon divergence is preferred over the Kullback-Leibler divergence for measuring how well a generative network has learned a distribution mapping, because the JS divergence better measures distribution similarity… (a small numeric illustration follows this entry)

ashenoy (1,409)
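The point raised in the excerpt above can be illustrated with a small numeric sketch (the two distributions below are chosen arbitrarily): when the distributions have disjoint support, as can happen between real data and the samples of a poorly trained generator, the KL divergence is infinite, while the JS divergence stays bounded by log 2 (1 bit in base 2). The sketch uses SciPy's entropy for the KL terms.

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes KL(p || q)

# Two distributions with disjoint support.
p = np.array([1.0, 0.0])
q = np.array([0.0, 1.0])

print(entropy(p, q, base=2))   # inf: KL(p || q) blows up where q = 0 but p > 0

m = 0.5 * (p + q)              # mixture distribution used by the JS divergence
js = 0.5 * entropy(p, m, base=2) + 0.5 * entropy(q, m, base=2)
print(js)                      # 1.0: the JS divergence is bounded by log2(2) = 1 bit
```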
0 votes, 0 answers
Why did the authors of D2GAN propose two discriminators using KL and reverse KL divergence losses instead of one discriminator using JS divergence?
I have stumbled upon the D2GAN paper as part of my research, and I am confused that, instead of using the JS divergence to capture both the KL divergence and its reverse, the authors opted for two discriminators with opposite objective… (a short numeric comparison follows this entry)

Minuano (101)
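As an aside to the question above, a minimal sketch (with arbitrarily chosen example distributions p and q) showing that the forward KL, the reverse KL, and the JS divergence are three distinct quantities: the JS divergence is not the average of the forward and reverse KL, since its two KL terms are taken against the mixture m.

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes KL(p || q)

p = np.array([0.7, 0.2, 0.1])   # hypothetical "data" distribution
q = np.array([0.4, 0.4, 0.2])   # hypothetical "model" distribution

kl_forward = entropy(p, q)       # KL(p || q)
kl_reverse = entropy(q, p)       # KL(q || p): differs from the forward direction
m = 0.5 * (p + q)
js = 0.5 * entropy(p, m) + 0.5 * entropy(q, m)   # KL terms taken against the mixture

print(kl_forward, kl_reverse, js)
print(0.5 * (kl_forward + kl_reverse))           # averaging the two KLs gives a different (Jeffreys-style) quantity, not js
```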