Recently, I ran some code on my system that uses deep neural networks. The number of epochs set by the designers is 301.
I tried increasing the number of epochs to 501. To my surprise, after about 350 epochs the model starts behaving erratically and returns nonsensical values.
What makes this possible? Is the number of epochs also a hyperparameter / magic number, i.e., an upper bound beyond which the model fails?
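In case it helps frame the question: my current guess is that one guards against this with early stopping on validation loss rather than trusting a fixed epoch count. A minimal sketch of what I mean (all names hypothetical, not from the actual code I ran):

```python
# Early stopping on validation loss: stop once the loss has not improved
# for `patience` consecutive epochs, instead of training to a fixed count
# like 301 or 501. This is a generic sketch, not the designers' code.

class EarlyStopping:
    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience      # epochs to wait without improvement
        self.min_delta = min_delta    # minimum change that counts as improvement
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True if training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience


# Toy usage: a made-up loss curve that improves for a while, then diverges.
losses = [1.0, 0.8, 0.6, 0.5, 0.45] + [0.5 + 0.1 * i for i in range(20)]
stopper = EarlyStopping(patience=5)
for epoch, loss in enumerate(losses):
    if stopper.step(loss):
        print(f"stopped at epoch {epoch}, best val loss {stopper.best}")
        break
```

Is this the right mental model, i.e., the safe number of epochs is data-dependent and should be found by monitoring validation loss rather than hard-coded?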