Why does the neural network training end before reaching the specified maximum number of epochs?

This is how I am setting the training options:
options = trainingOptions('sgdm', 'MiniBatchSize', miniBatchSize, 'MaxEpochs', 4000)
But it looks like the training ended without reaching the maximum number of epochs. Is this normal? And what actually determines the total number of epochs run during training?

Accepted Answer

MathWorks Support Team
MathWorks Support Team on 18 Aug 2021
Edited: MathWorks Support Team on 7 Sep 2021
Several criteria can cause a neural network to stop training before the specified maximum number of epochs.
As you may know, an epoch is one full pass of the training algorithm over the entire training set. In general, training stops before reaching the specified maximum number of epochs in order to avoid overfitting the data, which improves the network's generalization. That is, training stops when the validation results are no longer improving (within some tolerance).
Please refer to the following link for more information on the early stopping behavior for improving network generalization:
 
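As a concrete illustration, here is a minimal sketch (assuming the Deep Learning Toolbox and its built-in digit example data set, used here purely for illustration) of how validation-based early stopping is controlled through the ValidationData and ValidationPatience options of trainingOptions. With a finite ValidationPatience, training stops once the validation loss fails to improve for that many consecutive validation evaluations, even if MaxEpochs has not been reached; setting ValidationPatience to Inf disables this stopping criterion.

% Minimal sketch -- assumes Deep Learning Toolbox and its digit example data.
[XTrain, YTrain] = digitTrain4DArrayData;   % training images and labels
[XVal,   YVal]   = digitTest4DArrayData;    % held-out validation set

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 8, 'Padding', 'same')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% ValidationPatience controls early stopping: training halts when the
% validation loss has not improved for that many validation evaluations.
% Use 'ValidationPatience', Inf to let training run for the full MaxEpochs.
options = trainingOptions('sgdm', ...
    'MiniBatchSize', 128, ...
    'MaxEpochs', 4000, ...
    'ValidationData', {XVal, YVal}, ...
    'ValidationFrequency', 30, ...
    'ValidationPatience', 5);

net = trainNetwork(XTrain, YTrain, layers, options);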

More Answers (0)
