newnarx initialization and validation problem

Dear all,
I am working on the problem y(k) = 0.3*y(k-1) + 0.6*y(k-2) + u(k)^3 + 0.3*u(k)^2 - 0.4*u(k) using a NARX network. The code is listed below.
net = newnarx(u,yn,0,1:2,[15 10],{'tansig','tansig','purelin'},'trainscg');
net.trainParam.lr = 0.05;
net.trainParam.lr_inc = 1.05;
net.trainParam.lr_dec = 0.7;
net.trainParam.hide = 50;
net.trainParam.mc = 0.9;
net.trainParam.epochs = s;
net.trainParam.goal = 1e-8;
net.trainParam.time = 5*3600;
[trainP,valP,testV,trainInd,valInd,testInd] = divideblock(u,0.6,0.2,0.2);
[trainT,valT,testT] = divideind(yn,trainInd,valInd,testInd);
net.divideFcn = 'divideblock';
net = init(net);
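For reference, the target difference equation itself is easy to simulate, which is handy for generating training data or sanity-checking the trained net. A minimal sketch in Python/NumPy (not the MATLAB toolbox code; the zero initial conditions and the uniform excitation signal are arbitrary choices for illustration):

```python
import numpy as np

def simulate_plant(u, y0=(0.0, 0.0)):
    """Simulate y(k) = 0.3*y(k-1) + 0.6*y(k-2) + u(k)^3 + 0.3*u(k)^2 - 0.4*u(k)."""
    y = np.zeros(len(u))
    y[0], y[1] = y0          # assumed initial conditions (first two samples)
    for k in range(2, len(u)):
        y[k] = 0.3*y[k-1] + 0.6*y[k-2] + u[k]**3 + 0.3*u[k]**2 - 0.4*u[k]
    return y

rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, size=1000)   # random excitation input
y = simulate_plant(u)                   # target series for the NARX model
```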
My training stops after 6 validation checks. Why? Also, the trained net gives different outputs each time I run it. Is there a mistake in my initialization?
Please help.

Accepted Answer

Greg Heath
Greg Heath on 27 Oct 2012
See my previous post re your last program. Most of the comments are relevant for this post.
In particular, do not overwrite defaults unless you have a darned good reason.
Also, it is very seldom that a 2nd hidden layer is necessary.
Validation stopping was created to make sure the trained net is useful for nontraining data. Stopping after the validation MSE increases for 6 consecutive epochs is reasonable.
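That stopping rule can be sketched in a few lines. This is an illustrative Python stand-in, not the toolbox implementation; `max_fail = 6` matches the behavior described above, and the error sequence is made up:

```python
def train_with_val_stop(val_errors, max_fail=6):
    """Return the epoch at which training stops, given per-epoch validation errors:
    stop once the error has failed to improve for max_fail consecutive epochs."""
    best = float('inf')
    fails = 0
    for epoch, err in enumerate(val_errors):
        if err < best:
            best = err          # new best validation error
            fails = 0
        else:
            fails += 1          # one more consecutive failure to improve
            if fails >= max_fail:
                return epoch    # validation stopping triggers here
    return len(val_errors) - 1  # ran out of epochs without triggering

# Error falls, then rises for 6 straight epochs -> stops at epoch 9,
# never reaching the later (better) value.
errs = [1.0, 0.8, 0.6, 0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.2]
print(train_with_val_stop(errs))   # prints 9
```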
You will get different results every time because the initial weights are random.
For each different setting of delays and hidden nodes, I usually loop over Ntrials = 10 designs.
You would be surprised how many times only 7 or 8 out of 10 are acceptable.
However, I do initialize the random number generator before the double loop over hidden nodes and random weight initializations.
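The loop structure described above can be sketched as follows. This is an illustrative Python outline of the experiment design, with a placeholder `train_once` standing in for an actual network training run; the seed, hidden-layer sizes, and the "keep the minimum" selection rule are assumptions for the sketch:

```python
import numpy as np

def train_once(hidden, rng):
    """Placeholder for one training run with random initial weights."""
    w0 = rng.standard_normal(hidden)   # random initial weights for this trial
    return float(np.sum(w0**2))        # stand-in for the resulting validation MSE

rng = np.random.default_rng(42)        # seed ONCE, before the double loop,
                                       # so the whole experiment is reproducible
results = {}
for hidden in (5, 10, 15):             # outer loop: candidate hidden-node counts
    trials = [train_once(hidden, rng) for _ in range(10)]  # inner loop: Ntrials = 10
    results[hidden] = min(trials)      # keep the best of the 10 random designs
```

Because the generator is seeded outside the loops, each trial still draws different initial weights, but rerunning the whole script reproduces the same 30 designs.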
Hope this helps.
Thank you for formally accepting my answer.
Greg

More Answers (0)
