Different neural network training result each time
15 views (last 30 days)
Morten
on 30 Sep 2011
Commented: Salma Hassan on 2 Feb 2018
Hey,
I am trying to implement a neural network with leave-one-out cross-validation. The problem is that I get a different result each time I train the network.
My code is:
-------
hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);
net.divideFcn = '';                      % use all data for training
net = train(net, inputs, targets);
testOut = net(validation);
[c, cm] = confusion(validationTarget, testOut); % cm is the confusion matrix
TP = cm(1,1); FN = cm(1,2); TN = cm(2,2); FP = cm(2,1);
fprintf('Sensitivity : %f%%\n', TP/(TP+FN)*100);
fprintf('Specificity : %f%%\n\n', TN/(TN+FP)*100);
-----------
Is it because train() uses different proportions of the input data each time? To rule this out, I have tried to avoid splitting the data into training, validation and test sets by setting net.divideFcn = ''. I have also tried setting net.divideParam.trainRatio = 100/100.
I have also tried setting EW = 1, but it does not change anything.
Any suggestions?
Morten
1 Comment
Greg Heath
on 3 Oct 2011
Terminology:
Data = DesignSet + TestSet
DesignSet = TrainingSet + ValidationSet
DesignSet: used iteratively to determine the final design parameters (number of hidden nodes, number of epochs, weight values, etc.)
TrainingSet: used to estimate the weights
ValidationSet: iterative performance estimates used to select the final design parameters. The final validation performance is generally biased because of the iterative feedback between validation and training.
TestSet: used once and only once to estimate unbiased generalization performance (i.e., performance on unseen nondesign data)
If TestSet performance is unsatisfactory and additional designing is desired, Data should be repartitioned to mitigate feedback biasing.
There are several different ways to use cross validation (XVAL). The most important principle is that final performance estimate biasing can be mitigated by using a test set that was in no way used to determine design parameters.
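As an illustration of the principle above, one might hold out the test set before any design work begins. This is only a sketch: the 15% split and the variable names are assumptions, with samples stored column-wise as in the question's code.

```matlab
% Set aside a test set that is used once and only once.
N     = size(inputs, 2);             % number of samples (columns)
idx   = randperm(N);                 % random permutation of samples
nTest = round(0.15 * N);             % illustrative 15% hold-out

testIdx   = idx(1:nTest);            % untouched during design
designIdx = idx(nTest+1:end);        % training + validation

net = patternnet(10);
net = train(net, inputs(:, designIdx), targets(:, designIdx));

% Final, unbiased estimate of generalization performance:
testOut = net(inputs(:, testIdx));
[c, cm] = confusion(targets(:, testIdx), testOut);
```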
Hope this helps.
Greg
Accepted Answer
Pawel Blaszczyk
on 30 Sep 2011
Try adding this command at the beginning of the script:
RandStream.setDefaultStream(RandStream('mt19937ar','seed',1));
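A note for newer MATLAB releases: RandStream.setDefaultStream was deprecated in favor of RandStream.setGlobalStream, and the simpler rng function is the documented way to seed the generator. A minimal modern equivalent, reusing the inputs/targets variables from the question:

```matlab
% Seed the global RNG once so weight initialization is repeatable.
rng(1, 'twister');                  % same effect as the line above

hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);
net.divideFcn = '';                 % no random data division
net = train(net, inputs, targets);  % identical result on every run
```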
3 Comments
Greg Heath
on 31 Jan 2018
I don't recommend using this code.
I'm sure you can find a better one in the NEWSGROUP or ANSWERS.
In fact, I don't even recommend f-fold XVAL for neural nets. It is much, much easier to just use multiple sets of random initial weights.
I have posted HUNDREDS of examples in both the NEWSGROUP and ANSWERS.
Greg
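Greg's suggestion of using multiple sets of random initial weights instead of XVAL might look like the following sketch. The number of trials and the use of training performance as the selection criterion are illustrative assumptions, not details from the thread.

```matlab
% Train several networks from different random initial weights
% and keep the one with the best (lowest) training error.
Ntrials  = 10;                        % illustrative number of trials
bestPerf = Inf;
bestNet  = [];
for k = 1:Ntrials
    net = patternnet(10);             % 10 hidden nodes, as in the question
    net.divideFcn = '';
    net = configure(net, inputs, targets);  % draws fresh random weights
    [net, tr] = train(net, inputs, targets);
    if tr.best_perf < bestPerf        % keep the best-performing net
        bestPerf = tr.best_perf;
        bestNet  = net;
    end
end
```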
More Answers (5)
Pawel Blaszczyk
on 30 Sep 2011
Because your net is initialized with random weights, each training run has a different starting point, so each simulation gives a different result. If you always start from the same weights, you will always get the same answer. The function above sets the same seed every time, so the rand() sequence is always identical.
2 Comments
Greg Heath
on 3 Oct 2011
The only purpose of resetting the RNG with a previous seed is to reproduce previous results. It should not be reset during an XVAL experiment: resetting with a previous seed (even if the data partition is different) violates the implicit assumption of randomness.
Hope this helps.
Greg
faramarz sa
on 22 Oct 2013
Edited: faramarz sa on 22 Oct 2013
The MATLAB Neural Network Toolbox gives different results on each run for two reasons: (1) random data division and (2) random weight initialization.
For the data-division problem, use the function "divideblock" or "divideint" instead of "dividerand", like this:
net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
For the random weight initialization problem, it seems (I'm not sure) that all MATLAB initialization functions ("initzero", "initlay", "initwb", "initnw") involve randomness, so you should force these functions to produce the same result on each call:
RandStream.setGlobalStream(RandStream('mrg32k3a', 'Seed', 1234));
And then use one of them:
net.initFcn = 'initlay';
net.layers{i}.initFcn = 'initnw';   % for each layer index i
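Putting the two fixes together, a reproducible training script might look like this sketch. The seed value and split ratios are illustrative; inputs and targets are the variables from the question.

```matlab
% Fix the seed (reproducible initial weights) and use a
% deterministic data division (reproducible train/val/test split).
rng(1234, 'twister');
net = patternnet(10);
net.divideFcn = 'divideblock';        % deterministic, contiguous split
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net = train(net, inputs, targets);    % same network on every run
```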
0 Comments
Greg Heath
on 3 Oct 2011
I suspect that similar results are obtained because the same RNG seed is used.
See my previous comments about not resetting the seed.
How large is your data set? I assume your trn/tst split is 50/50, and that you are using 2-fold XVAL without a validation set.
See my previous comments on the difference between validation and testing.
Hope this helps.
Greg