The number of epochs itself is not that significant. What matters more is the training and validation error. As long as the validation error keeps dropping, training should continue. If the validation error starts increasing, that may be an indication of overfitting. You should set the number of epochs as high as practical and terminate training based on the error rates.
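One common way to implement this in practice is an early-stopping callback, for example Keras's EarlyStopping. The sketch below is just an illustration (the toy data, layer sizes, and patience value are assumptions, not from this thread): it sets a very high epoch budget and lets the callback stop training once the validation loss stops improving.

```python
import numpy as np
import tensorflow as tf

# Toy regression data; shapes and values are made up for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8)).astype("float32")
y = (X @ rng.normal(size=8)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Set epochs very high; the callback halts training when the validation
# loss stops improving and restores the best weights seen so far.
stopper = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=20,                 # epochs to wait without improvement (assumed value)
    restore_best_weights=True,
)
model.fit(X, y, epochs=10_000, validation_split=0.2,
          callbacks=[stopper], verbose=0)
```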
Just to be clear, an epoch is one learning cycle where the learner sees the whole training data set. If you have two batches, the learner needs to go through two iterations for one epoch.
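As a quick sanity check of the terminology: the number of iterations per epoch is just the dataset size divided by the batch size, rounded up. A minimal Python sketch (the sample and batch counts are made-up numbers matching the two-batch example above):

```python
import math

n_samples = 2000    # hypothetical dataset size
batch_size = 1000   # splits the data into two batches

iterations_per_epoch = math.ceil(n_samples / batch_size)
print(iterations_per_epoch)  # -> 2: two iterations make up one epoch
```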
If you have enough data, you can try the early-stopping method: divide the data into three sets, training, validation, and evaluation. Train the network for enough epochs that the training mean squared error (MSE) settles into a minimum. Training uses the training set and proceeds epoch by epoch; after each epoch, compute the network's MSE on the validation set. The network from the epoch with the minimum validation MSE is the one selected for the evaluation process.
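A minimal sketch of that procedure, using plain NumPy and a one-parameter linear model so it stays short (the synthetic data, learning rate, and split sizes are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + noise (assumed, purely for illustration)
X = rng.normal(size=(300, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=300)

# Split into training / validation / evaluation sets
X_tr, y_tr = X[:200], y[:200]
X_va, y_va = X[200:250], y[200:250]
X_ev, y_ev = X[250:], y[250:]

def mse(w, b, X, y):
    return np.mean((X[:, 0] * w + b - y) ** 2)

w, b = 0.0, 0.0                  # a tiny "network" keeps the sketch short
lr, max_epochs = 0.05, 500
best_mse, best_params, best_epoch = np.inf, (w, b), 0

for epoch in range(1, max_epochs + 1):
    # One epoch of full-batch gradient descent on the training set
    err = X_tr[:, 0] * w + b - y_tr
    w -= lr * 2 * np.mean(err * X_tr[:, 0])
    b -= lr * 2 * np.mean(err)

    # Compute validation MSE after every epoch; remember the best model
    val = mse(w, b, X_va, y_va)
    if val < best_mse:
        best_mse, best_params, best_epoch = val, (w, b), epoch

# The model from the epoch with minimum validation MSE goes to evaluation
w_best, b_best = best_params
print(f"best epoch: {best_epoch}, "
      f"evaluation MSE: {mse(w_best, b_best, X_ev, y_ev):.4f}")
```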
In short: the number of epochs is not that significant; the training and validation errors matter more. As long as these two errors keep dropping, training should continue. Set the number of epochs as high as possible and terminate training when the validation error starts increasing.
References:
https://www.researchgate.net/post/How_to_determine_the_correct_number_of_epoch_during_neural_network_training
https://www.researchgate.net/post/How_does_one_choose_optimal_number_of_epochs
https://medium.com/@upendravijay2/what-is-epoch-and-how-to-choose-the-correct-number-of-epoch-d170656adaaf