In the book Generative AI with Python and TensorFlow 2 by Babcock and Bali (page 172), the value function of a GAN is stated as follows:

[image: the value function of the GAN as printed in the book]

where D(x) is the output of the discriminator and G(z) is the output of the generator. However, I don't understand why the logarithm is applied twice. The value D(x) is supposed to be a probability, meaning that D(x) lies in the interval (0, 1). Taking that into account, log D(x) would be a negative number, so log(log D(x)) shouldn't exist, because the logarithm of a negative number is undefined.
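
For example, here is a quick check in Python (my own snippet, just to illustrate the problem, not code from the book):

```python
import math

d_x = 0.8                       # a typical discriminator output, somewhere in (0, 1)
print(math.log(d_x))            # about -0.223, negative for any d_x < 1
print(math.log(math.log(d_x)))  # raises ValueError: math domain error
```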

Can anyone shed some light on this? Is the function wrong, or is there something I am missing?

Claudia P
  • Instead of writing "Query regarding...", can you please just put your **specific question** in the title? – nbro Apr 17 '23 at 12:00

1 Answer


Assuming there is no other notational trickery in the book, I'd say this value function is wrong. Optimizing log(log(x)) would push x in the same direction as optimizing log(x), but log(log(x)) is only defined for inputs x > 1, whereas the discriminator outputs values in (0, 1), so the expression would be undefined.

Maybe someone else who has access to the book can clarify exactly how they arrive at the value function they use. Until then, I'd recommend simply thinking of the value function used in the original GAN paper (Goodfellow et al., 2014):

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]$$
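
In code, this objective is usually implemented with binary cross-entropy. Here is a rough TensorFlow 2 sketch of that standard formulation (my own illustration, not the book's code; I'm assuming the discriminator outputs raw logits, and `real_output`/`fake_output` are whatever your discriminator produces for real and generated batches):

```python
import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def discriminator_loss(real_output, fake_output):
    # -E[log D(x)]: cross-entropy against labels of 1 for real samples
    real_loss = bce(tf.ones_like(real_output), real_output)
    # -E[log(1 - D(G(z)))]: cross-entropy against labels of 0 for generated samples
    fake_loss = bce(tf.zeros_like(fake_output), fake_output)
    return real_loss + fake_loss

def generator_loss(fake_output):
    # Non-saturating variant: the generator maximizes E[log D(G(z))]
    # instead of minimizing E[log(1 - D(G(z)))], as suggested in the original paper
    return bce(tf.ones_like(fake_output), fake_output)
```

Note that only single logarithms appear anywhere, which is consistent with the expression from the paper rather than the one in the book.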

Robin van Hoorn