
I've been trying to find the optimal number of epochs to train my (newly implemented) neural network for.

The visualization below shows the neural network trained for a varying number of epochs. The accuracy clearly increases with the number of epochs; however, at 75 epochs there is a dip before the accuracy continues to rise. What is the cause of this?

[Plot: accuracy vs. number of training epochs, showing a dip around 75 epochs]

• You really need to provide more details about your problem, neural network, and data. Is your model a simple multi-layer perceptron (i.e., no recurrent connections or convolutions)? How many layers does it have (and how many neurons per layer)? If you are using the MSE loss, I suppose you are solving a regression problem, but you're reporting accuracy... So, which problem is it exactly? How many data points do you have? Please edit your post to include all these details. – nbro Nov 05 '20 at 11:30

1 Answer


A decrease in loss does not necessarily lead to an increase in accuracy (most of the time it does, but sometimes it does not). To see why, you can have a look at this question. The network only cares about decreasing the loss; it does not care about the accuracy at all. So it's no surprise to see what you presented.
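
To make this concrete, here is a minimal numpy sketch with made-up probabilities (not taken from your network) where the cross-entropy loss goes down while the accuracy also goes down: the model becomes more confident on most examples but pushes one prediction to the wrong side of the 0.5 threshold.

```python
import numpy as np

y_true = np.array([1, 1, 1, 0])

# Hypothetical predicted probabilities of the positive class
p_early = np.array([0.55, 0.55, 0.55, 0.45])  # all 4 predictions correct
p_late  = np.array([0.95, 0.95, 0.45, 0.05])  # only 3 of 4 correct, but more confident

def cross_entropy(y, p):
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def accuracy(y, p):
    # Threshold the probabilities at 0.5 and compare to the labels
    return np.mean((p >= 0.5) == y)

for name, p in [("early", p_early), ("late", p_late)]:
    print(f"{name}: loss = {cross_entropy(y_true, p):.3f}, "
          f"accuracy = {accuracy(y_true, p):.2f}")
# early: loss = 0.598, accuracy = 1.00
# late:  loss = 0.238, accuracy = 0.75  -> lower loss, lower accuracy
```

Since the optimizer only ever sees the loss, nothing prevents it from moving between states like these, which is why the accuracy curve can dip while training is otherwise progressing.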

Additional note: if you train your network with mini-batches, or if you choose a large step size, you may also see the loss itself increase from time to time. In the mini-batch case, the hope is that the loss still decreases in trend, even if not monotonically.
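
As an illustration of that noise, the following sketch trains a tiny logistic-regression model with mini-batch SGD on synthetic data (the data, batch size, and step size are all made up for demonstration). The loss on each mini-batch jumps around from step to step, but its overall trend is downward:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, linearly separable toy data (made up for this illustration)
X = rng.normal(size=(256, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w, b = np.zeros(2), 0.0
lr = 0.5            # deliberately large step size, so the noise is visible
batch_size = 8

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(1, 101):
    idx = rng.choice(len(X), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    p = np.clip(sigmoid(Xb @ w + b), 1e-12, 1 - 1e-12)
    # Cross-entropy loss on this mini-batch only
    loss = -np.mean(yb * np.log(p) + (1 - yb) * np.log(1 - p))
    # Gradient of the cross-entropy loss for logistic regression
    w -= lr * (Xb.T @ (p - yb)) / batch_size
    b -= lr * np.mean(p - yb)
    if step % 10 == 0:
        print(f"step {step:3d}  mini-batch loss {loss:.3f}")
```

If you smooth the per-batch losses (e.g., with a running average), the downward trend becomes much easier to see than in the raw curve.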

amin