
The divergence between two probability distributions quantifies the difference between the true distribution and the generated distribution. These divergence measures are commonly used to define loss functions.

Some divergence metrics that are generally used in literature are:

  1. Kullback-Leibler Divergence
  2. Jensen–Shannon divergence
  3. f-divergence
  4. Wasserstein distance
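
For concreteness, here is a minimal sketch (in Python with NumPy/SciPy; the example distributions are made up for illustration) of how some of these can be computed for two discrete distributions:

```python
import numpy as np
from scipy.stats import entropy, wasserstein_distance
from scipy.spatial.distance import jensenshannon

# Two example discrete distributions over the same support
p = np.array([0.1, 0.4, 0.5])   # "true" distribution
q = np.array([0.3, 0.3, 0.4])   # "generated" distribution

# Kullback-Leibler divergence D_KL(P || Q)
kl = entropy(p, q)

# Jensen-Shannon divergence (jensenshannon returns its square root, the JS distance)
js = jensenshannon(p, q) ** 2

# 1-D Wasserstein distance between the two distributions on the support {0, 1, 2}
wd = wasserstein_distance([0, 1, 2], [0, 1, 2], u_weights=p, v_weights=q)

print(kl, js, wd)
```

(The f-divergences are a family rather than a single measure; KL and JS are both special cases.)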

Some other divergence measures include:

  1. Squared Hellinger distance
  2. Jeffreys divergence
  3. Chernoff's $\alpha$-divergence
  4. Exponential divergence
  5. Kagan's divergence
  6. $(\alpha, \beta)$-product divergence
  7. Bregman divergence
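
To make a couple of these concrete, here is a rough sketch of the squared Hellinger distance and a Bregman divergence (the negative-entropy generator is just one illustrative choice; other generators give other divergences):

```python
import numpy as np

p = np.array([0.1, 0.4, 0.5])   # true distribution
q = np.array([0.3, 0.3, 0.4])   # estimated distribution

# Squared Hellinger distance: H^2(P, Q) = 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2
hellinger_sq = 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

# Bregman divergence: D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
def bregman(p, q, F, grad_F):
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# With the negative-entropy generator F(x) = sum_i x_i log x_i,
# the Bregman divergence reduces to the KL divergence (for normalized p and q).
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

print(hellinger_sq, bregman(p, q, neg_entropy, grad_neg_entropy))
```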

I think some naive divergence measures include:

  1. Least-squares divergence
  2. Absolute deviation
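
By "naive" I mean simple element-wise discrepancies between the two probability vectors, e.g.:

```python
import numpy as np

p = np.array([0.1, 0.4, 0.5])   # true distribution
q = np.array([0.3, 0.3, 0.4])   # estimated distribution

least_squares = np.sum((p - q) ** 2)    # sum of squared differences
abs_deviation = np.sum(np.abs(p - q))   # L1 deviation (twice the total variation distance)
```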

Along with these, are there any other divergence measures available for computing the distance between the true probability distribution and the estimated probability distribution in artificial intelligence?

hanugm
  • You're already listing many measures, probably all the major ones used in the literature that I know of too. Anyway, keep in mind that our site is not appropriate for questions that require very long answers and a lot of effort and time to write. So, it may be a good idea to reformulate your question so that you provide more details about **why** you're interested in more measures, and maybe ask for a reference that covers them (rather than asking us to enumerate them), if you're interested in an extensive list. In that case, you may want to use the tag [tag:reference-request]. – nbro Aug 08 '21 at 12:29
  • Having said that, in [this answer](https://ai.stackexchange.com/a/25167/2444), I provide a link to a paper that talks about the topic, but I don't remember which measures are listed there (maybe you already list all the ones mentioned there too). – nbro Aug 08 '21 at 12:30
  • @nbro Thanks... I rephrased; I think now even a single metric is eligible as an answer. – hanugm Aug 08 '21 at 13:44

0 Answers