Let's say we have a WGAN where the generator and critic have 8 layers and 5 million parameters each. I know that the greater the number of training samples the better, but is there a way to know the minimum number of training examples needed? Does it depend on the size of the network or the distribution of the training set? How can I estimate it?
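Since the question asks how one could estimate this, it may help to spell out one common empirical approach (this is not a formula specific to WGANs, just a general data-ablation / learning-curve study): train the same model on increasingly large subsets of the data and track a sample-quality metric such as FID, stopping once the metric plateaus. A minimal sketch is below; `train_fn` and `score_fn` are hypothetical placeholders for your own training loop and evaluation metric, not functions from any particular library.

```python
import numpy as np

def data_ablation_scores(full_dataset, subset_sizes, train_fn, score_fn, seed=0):
    """Train the same WGAN on growing random subsets and record a quality
    score (e.g. FID, lower is better) for each subset size.

    Assumes `full_dataset` supports len() and numpy-style fancy indexing,
    and that `train_fn(data)` returns a trained model which `score_fn`
    can evaluate against the full dataset.
    """
    rng = np.random.default_rng(seed)
    scores = {}
    for n in sorted(subset_sizes):
        idx = rng.choice(len(full_dataset), size=n, replace=False)
        model = train_fn(full_dataset[idx])        # hypothetical training routine
        scores[n] = score_fn(model, full_dataset)  # hypothetical metric, e.g. FID
    return scores

# Example usage (hypothetical sizes): the smallest n after which doubling the
# data no longer improves the score noticeably is a rough lower bound.
# scores = data_ablation_scores(images, [1_000, 2_000, 4_000, 8_000], train_wgan, compute_fid)
```

The plateau point will depend on both the complexity of the data distribution and the capacity of the networks, which matches the intuition in the question; this kind of study only gives an empirical estimate for a specific dataset and architecture, not a general rule.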
- Hello. To clarify, are you only interested in WGANs? In any case, it seemed to me that this was the case, so I tried to clarify that in your post. Also, are you only interested in the task of learning a probability distribution in order to generate samples? In other words, which machine learning task are you interested in? It may also be a good idea to be more specific about what you mean by "interesting results", although in this case this detail should be obvious (though dependent on the task and performance measure). – nbro Feb 24 '21 at 15:29
- Sorry, I just started with machine learning less than a month ago. By "interesting" I mean what WDCGANs usually do: generate elements that look similar to the training samples without being identical to them. I asked the question with GANs in mind, but if someone can give guidelines for any ANN, that would be good too. – FalseSemiColon Feb 24 '21 at 19:06