
Whenever I tune a neural network, I usually take the common approach: define some number of layers, each with some number of neurons.

  • If it overfits, I reduce the number of layers or neurons, add dropout, or apply regularisation.

  • If it underfits, I do the opposite.

But this trial-and-error process sometimes feels unprincipled. So, is there a more principled way of tuning a neural network (i.e. of finding the optimal number of layers, neurons, etc. in a mathematically sound way) when it overfits or underfits?
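For concreteness, here is a minimal sketch of the kind of knob-twiddling I mean, assuming TensorFlow/Keras (the layer counts, widths, and rates below are arbitrary placeholders, not recommendations):

```python
# A minimal sketch of the manual tuning loop described above.
# Assumes TensorFlow/Keras; all hyperparameter values are placeholders.
from tensorflow import keras
from tensorflow.keras import layers


def build_model(n_hidden, units, dropout_rate, l2_strength,
                input_dim, n_classes):
    """Build an MLP whose capacity is controlled by a few knobs.

    If it overfits: lower n_hidden/units, raise dropout_rate/l2_strength.
    If it underfits: do the opposite.
    """
    model = keras.Sequential()
    model.add(keras.Input(shape=(input_dim,)))
    for _ in range(n_hidden):
        model.add(layers.Dense(
            units,
            activation="relu",
            kernel_regularizer=keras.regularizers.l2(l2_strength),
        ))
        model.add(layers.Dropout(dropout_rate))
    model.add(layers.Dense(n_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


# Start somewhere, then adjust the knobs by hand based on the
# train/validation gap -- this is the step that feels unprincipled.
model = build_model(n_hidden=2, units=64, dropout_rate=0.2,
                    l2_strength=1e-4, input_dim=20, n_classes=3)
```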

Fasty
    [How is neural architecture search performed?](https://ai.stackexchange.com/q/12434/2444) may be helpful. – Mar 18 '20 at 22:17

0 Answers