The divergence between two probability distributions is used to quantify the difference between the true distribution and the generated (estimated) distribution, and these divergence measures often serve as loss functions.
Some divergence measures that appear in the literature include (a small numerical sketch for two of them is given after the list):
- Squared Hellinger distance
- Jeffreys divergence
- Chernoff's $\alpha$-divergence
- Exponential divergence
- Kagan's divergence
- $(\alpha, \beta)$-product divergence
- Bregman divergence
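
To make two of these concrete, here is a minimal sketch for discrete distributions of the squared Hellinger distance, $H^2(P,Q) = \frac{1}{2}\sum_i \left(\sqrt{p_i} - \sqrt{q_i}\right)^2$, and the Jeffreys divergence (the symmetrized KL divergence), $J(P,Q) = \sum_i (p_i - q_i)\log\frac{p_i}{q_i}$. The arrays `p` and `q` are just placeholder examples:

```python
import numpy as np

def squared_hellinger(p, q):
    """Squared Hellinger distance: 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2."""
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def jeffreys(p, q, eps=1e-12):
    """Jeffreys divergence: KL(p||q) + KL(q||p) = sum_i (p_i - q_i) * log(p_i / q_i)."""
    p = np.clip(p, eps, None)  # avoid log(0) for zero-probability bins
    q = np.clip(q, eps, None)
    return np.sum((p - q) * np.log(p / q))

p = np.array([0.1, 0.4, 0.5])  # hypothetical true distribution
q = np.array([0.2, 0.3, 0.5])  # hypothetical estimated distribution

print(squared_hellinger(p, q))  # non-negative, 0 only when p == q
print(jeffreys(p, q))           # symmetric in p and q
```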
I think some more naive divergence measures (also sketched below) include:
- Least-squares divergence
- Absolute deviation
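
These two naive measures are just the squared and absolute element-wise differences between the probability vectors; a quick sketch with the same placeholder arrays as above:

```python
import numpy as np

def least_squares_divergence(p, q):
    """Sum of squared differences between the two probability vectors."""
    return np.sum((p - q) ** 2)

def absolute_deviation(p, q):
    """Sum of absolute differences; for discrete distributions this equals
    twice the total variation distance."""
    return np.sum(np.abs(p - q))

p = np.array([0.1, 0.4, 0.5])  # hypothetical true distribution
q = np.array([0.2, 0.3, 0.5])  # hypothetical estimated distribution

print(least_squares_divergence(p, q))  # 0.02
print(absolute_deviation(p, q))        # 0.2
```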
Besides these, are there any other divergence measures available for computing the distance between the true probability distribution and an estimated probability distribution in artificial intelligence?