
I'm worried that my neural network has become too complex. I don't want to end up with half of the network doing nothing but taking up space and resources.

So, what are the techniques for detecting and preventing overfitting, to avoid such problems?

kenorb
  • This question seems to be a superset of [this](https://ai.stackexchange.com/q/13706/2444). – nbro Dec 25 '21 at 18:23

1 Answer

  1. Usually you keep track of the training loss and the validation loss, and apply a suitable regularization technique (such as L1, L2, dropout, DropConnect, etc.).

  2. The more interesting technique is to observe your validation loss as a function of the number of parameters in the network (often controlled by the number of layers/feature maps). If validation performance starts dropping as you increase the model's complexity, then either your optimization is poor, or the model is simply memorizing all of the training samples and overfitting badly.
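Both points can be illustrated with a minimal sketch. This toy example uses polynomial regression as a stand-in for network capacity (the polynomial degree plays the role of the parameter count, and the ridge penalty plays the role of L2 regularization); the data, split sizes, and penalty strength are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy sine wave, split into train and validation sets.
x = rng.uniform(-1, 1, 60)
y = np.sin(3 * x) + rng.normal(0, 0.3, x.size)
x_tr, y_tr = x[:40], y[:40]
x_va, y_va = x[40:], y[40:]

def fit_poly(x, y, degree, l2=0.0):
    """Least-squares polynomial fit with an optional L2 (ridge) penalty."""
    X = np.vander(x, degree + 1)
    # Ridge normal equations: w = (X^T X + l2 * I)^-1 X^T y
    A = X.T @ X + l2 * np.eye(degree + 1)
    return np.linalg.solve(A, X.T @ y)

def mse(x, y, w):
    """Mean squared error of the fitted polynomial on (x, y)."""
    X = np.vander(x, w.size)
    return float(np.mean((X @ w - y) ** 2))

# Point 2: sweep model complexity and watch training vs validation loss.
# Training loss keeps falling with degree; validation loss eventually rises.
for d in (1, 3, 9):
    w = fit_poly(x_tr, y_tr, d)
    print(f"degree {d}: train={mse(x_tr, y_tr, w):.3f} "
          f"val={mse(x_va, y_va, w):.3f}")

# Point 1: the same high-capacity model with L2 regularization,
# which shrinks the weights and reduces the train/validation gap.
w_reg = fit_poly(x_tr, y_tr, 9, l2=1e-2)
print(f"degree 9 + L2: val={mse(x_va, y_va, w_reg):.3f}")
```

The same diagnostic applies to a neural network: plot training and validation loss against parameter count, and look for the point where the validation curve turns upward while the training curve keeps improving.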

FunkyKowal