How is it possible to have overfitting before the network has learned properly?

My question is: when my network's performance is around 90 - 98%, i.e. my training error is about 98% (I suppose such a performance means my net hasn't learned anything yet), how is it possible that my net stops training because of the early stopping point?

Accepted Answer

Greg Heath
Greg Heath on 21 Dec 2014
Edited: Greg Heath on 20 Feb 2015
Poorly worded question.
Are we supposed to guess:
1. That you are referring to a classifier ?
2. Which MATLAB function you are using ... patternnet ?
3. The number of classes c ?
4. The dimensionality of the inputs I ?
5. The number of hidden nodes H ?
6. The trn/val/tst ratio 0.7/0.15/0.15 ?
Overfitting only means that you have more unknown weights than training equations:
Nw > Ntrneq
where
Ntrneq = Ntrn*c
Nw = (I+1)*H + (H+1)*c
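For example, a quick sanity check in MATLAB (the I, H, c and Ntrn values below are illustrative assumptions, not taken from your post):

% Rough check of the overfitting condition Nw > Ntrneq for a
% single-hidden-layer patternnet with I inputs, H hidden nodes
% and c classes. All sizes here are assumed for illustration.
I    = 10;    % input dimensionality
c    = 3;     % number of classes = number of output nodes
H    = 15;    % number of hidden nodes
Ntrn = 70;    % number of training cases

Ntrneq = Ntrn*c;              % number of training equations
Nw     = (I+1)*H + (H+1)*c;   % number of unknown weights and biases
fprintf('Ntrneq = %d, Nw = %d, overfit = %d\n', Ntrneq, Nw, Nw > Ntrneq)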
Validation stopping has nothing to do with training data performance. It has to do with
OVERTRAINING AN OVERFIT NET
It means that training has reached the point where the validation-set performance (mse or cross-entropy) has passed through a local minimum, indicating that if you don't stop, you will have over-trained an over-fit net to the point where further training will probably make it perform worse on validation, test and unseen non-training data.
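As a rough sketch of what that looks like with patternnet (the hidden layer size, the iris_dataset example data and the max_fail value are assumptions for illustration):

% Validation stopping with patternnet: training halts when the
% validation error fails to improve for max_fail consecutive epochs.
[x, t] = iris_dataset;                 % example data shipped with the toolbox
net = patternnet(10);                  % H = 10 hidden nodes (assumed)
net.divideFcn              = 'dividerand';
net.divideParam.trainRatio = 0.70;     % trn/val/tst = 0.7/0.15/0.15
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net.trainParam.max_fail    = 6;        % default validation-failure limit
[net, tr] = train(net, x, t);
disp(tr.stop)                          % e.g. 'Validation stop.'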
Remember:
The goal of design is to use training data to obtain a net that works well on all non-training data:
validation + test + unseen
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (0)
