
Recently, I ran some code on my system that involves deep neural networks. The number of epochs set by the designers was 301.

I tried increasing the number of epochs to 501. To my shock, after about 350 epochs the model behaves erratically; it just returns nonsensical values.

What makes such a phenomenon possible? Is the number of epochs also a hyperparameter/magic number with an upper bound beyond which the model fails?


1 Answer


There is nothing special about these particular numbers; everything depends on the network, the software, the model, and the data. As the number of epochs increases, the weights are updated more times, and the fit moves from underfitting toward overfitting. Overfitting is exactly the "eccentric" and "crazy" behavior you describe; see the figure below.

[Figure: model fit progressing from underfitting to overfitting as training continues]
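You can reproduce this divergence yourself. Below is a minimal sketch (not the asker's actual setup, which we don't have): plain-NumPy gradient descent fitting an over-parameterized polynomial to a small noisy dataset. The degree, learning rate, and epoch counts are illustrative assumptions; the point is that training loss keeps falling while validation loss typically flattens and then rises as epochs pile up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: y = sin(x) + noise, split into train and validation.
x = rng.uniform(-3, 3, 40)
y = np.sin(x) + rng.normal(0, 0.3, x.size)
x_train, y_train = x[:30], y[:30]
x_val, y_val = x[30:], y[30:]

def features(x, degree=15):
    # A high-degree polynomial gives the model enough capacity to overfit.
    return np.vander(x / 3.0, degree + 1, increasing=True)

X_train, X_val = features(x_train), features(x_val)
w = np.zeros(X_train.shape[1])
lr = 0.05

for epoch in range(1, 5001):
    # One full-batch gradient-descent step on mean squared error.
    grad = X_train.T @ (X_train @ w - y_train) / len(y_train)
    w -= lr * grad
    if epoch % 500 == 0:
        train_mse = np.mean((X_train @ w - y_train) ** 2)
        val_mse = np.mean((X_val @ w - y_val) ** 2)
        # Train loss keeps shrinking; validation loss eventually rises,
        # which is the overfitting the answer describes.
        print(f"epoch {epoch:5d}  train={train_mse:.4f}  val={val_mse:.4f}")
```

This is also why the number of epochs is, in practice, treated as a hyperparameter: one typically monitors validation loss and stops early (or restores the best checkpoint) once it starts to climb.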

  • OP is asking about the learning curve (the dynamics of the loss function and metrics over time), not about overfitting (too close an alignment of the curve to the data). – spiridon_the_sun_rotator Sep 09 '21 at 04:05
  • @spiridon_the_sun_rotator, what makes you think so? Any proof, with a quote from the question text? Anyway, it would be better if the author clarified. – Damir Tenishev Sep 09 '21 at 10:59
  • He says that after some time the network starts to predict strange values. Actually, the point to be clarified is whether this happens on unseen data or on any sample. Increasing the number of epochs doesn't usually lead to rapid changes in the loss; the common phenomenon is improvement on the training set and (slow) deterioration on the test or validation set. – spiridon_the_sun_rotator Sep 09 '21 at 12:13
  • Where does he say "after some time"? Where is "about the learning curve"? Where is "dynamics of the loss function and metrics *with time*"? He says that the model produces crazy values when he adjusts the number of epochs. That's it. Let's listen to @hanugm and avoid speculation. – Damir Tenishev Sep 09 '21 at 12:24