
NARXNET: Validation stop

Christian Hofstetter on 25 Aug 2017
Commented: Greg Heath on 13 Sep 2017
Hi,
I am using the code below to fit the data in the appendix. Since this data set is huge [1x984468], I am trying to fit a smaller subset of it (1:20000) first. The problem is that I only ever get "Validation stop" and never reach the minimum gradient, even though I am not exceeding Hmax.
I tried the code on the nndatasets example "valve_dataset" as well; the results are shown below.
Can anyone see the problem?
Thanks a lot!
plt = 0;
tic
% x = u1(1:20000);
% t = y1(1:20000);
% X = con2seq(x);
% T = con2seq(t);
[X,T] = valve_dataset;
x = cell2mat(X);
t = cell2mat(T);
[ I N ] = size(X);
[ O N ] = size(T);
MSE00 = mean(var(t',1))
MSE00a = mean(var(t',0))
% Normalization
zx = zscore(cell2mat(X), 1);
zt = zscore(cell2mat(T), 1);
Ntrn = N-2*round(0.15*N);
trnind = 1:Ntrn;
Ttrn = T(trnind);
Neq = prod(size(Ttrn));
%%Determine significant lags
%{
plt=plt+1,figure(plt)
subplot(211)
plot(t)
title('SIMPLENAR SERIES')
subplot(212)
plot(zt)
title('STANDARDIZED SERIES')
rng('default')
n = randn(1,N);
L = floor(0.95*(2*N-1))
for i = 1:100
autocorrn = nncorr( n,n, N-1, 'biased');
sortabsautocorrn = sort(abs(autocorrn));
thresh95(i) = sortabsautocorrn(L);
end
sigthresh95 = mean(thresh95) % 0.2194
autocorrt = nncorr(zt,zt,N-1,'biased');
siglag95 = -1+ find(abs(autocorrt(N:2*N-1))>=sigthresh95);
plt = plt+1, figure(plt)
hold on
plot(0:N-1, -sigthresh95*ones(1,N),'b--')
plot(0:N-1, zeros(1,N),'k')
plot(0:N-1, sigthresh95*ones(1,N),'b--')
plot(0:N-1, autocorrt(N:2*N-1))
plot(siglag95,autocorrt(N+siglag95),'ro')
title('SIGNIFICANT SIMPLENAR AUTOCORRELATIONS')
%INPUT-TARGET CROSSCORRELATION
%
crosscorrxt = nncorr(zx,zt,N-1,'biased');
sigilag95 = -1+ find(abs(crosscorrxt(N:2*N-1))>=sigthresh95); %significant input lag
%
plt = plt+1, figure(plt)
hold on
plot(0:N-1, -sigthresh95*ones(1,N),'b--')
plot(0:N-1, zeros(1,N),'k')
plot(0:N-1, sigthresh95*ones(1,N),'b--')
plot(0:N-1, crosscorrxt(N:2*N-1))
plot(sigilag95,crosscorrxt(N+sigilag95),'ro')
title('SIGNIFICANT INPUT-TARGET CROSSCORRELATIONS')
%}
FD = 1:1; %Random Selection of sigflag subset
ID = 1:2; %Random selection of sigilag subset crosscorrelation
NFD = length(FD);
NID = length(ID);
MXFD = max(FD);
MXID = max(ID);
Ntrneq = prod(size(t));
Hub = -1+ceil( (Ntrneq-O) / ((NID*I)+(NFD*O)+1));
Hmax = floor(Hub/50);
Hmin = 0;
dh = 1;
Ntrials = 10;
j = 0;
rng(0)
for h = Hmin:dh:Hmax
fprintf(['_____________H %','d/%d_____________\n'],h,Hmax)
j = j+1
if h == 0
net = narxnet( ID, FD, [] );
Nw = ( NID*I + NFD*O + 1 )*O
else
net = narxnet( ID, FD, h );
Nw = ( NID*I + NFD*O + 1 )*h + ( h + 1 )*O
end
Ndof = Ntrn-Nw
[ Xs Xi Ai Ts ] = preparets( net,X,{},T );
ts = cell2mat(Ts);
xs = cell2mat(Xs);
MSE00s = mean(var(ts',1))
MSE00as = mean(var(ts'))
MSEgoal = max( 0,0.01*Ndof*MSE00as/Neq )
MinGrad = MSEgoal/100
net.trainParam.goal = MSEgoal;
net.trainParam.min_grad = MinGrad;
net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 70/100;
net.divideParam.testRatio = 15/100;
net.divideParam.valRatio = 15/100;
for i = 1:Ntrials
net = configure(net,Xs,Ts);
[ net tr Ys ] = train(net,Xs,Ts,Xi,Ai);
ys = cell2mat(Ys);
stopcrit{i,j} = tr.stop
bestepoch(i,j) = tr.best_epoch
MSE = mse(ts-ys)
MSEa = Neq*MSE/Ndof
R2(i,j) = 1-MSE/MSE00s
R2a(i,j) = 1-MSEa/MSE00as
end
end
stopcrit = stopcrit
bestepoch = bestepoch
R2 = R2
R2a = R2a
Totaltime = toc
stopcrit =
Columns 1 through 2
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
'Minimum gradient...' 'Minimum gradient...'
Columns 3 through 4
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Minimum gradient...'
'Validation stop.' 'Minimum gradient...'
'Validation stop.' 'Validation stop.'
'Minimum gradient...' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
Columns 5 through 6
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
Columns 7 through 8
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.'
Column 9
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
'Validation stop.'
R2 =
Columns 1 through 6
0.9109 0.9116 0.9196 0.9156 0.9236 0.9242
0.9109 0.9116 0.9154 0.9213 0.9124 0.9229
0.9109 0.9116 0.9125 0.9185 0.9202 0.9285
0.9109 0.9116 0.9118 0.9230 0.9311 0.9253
0.9109 0.9117 0.9118 0.9201 0.9343 0.9240
0.9109 0.9115 0.9111 0.9125 0.9338 0.9224
0.9109 0.9116 0.9170 0.9177 0.9188 0.9353
0.9109 0.9116 0.9118 0.9137 0.9292 0.9320
0.9109 0.9115 0.9125 0.9129 0.9312 0.9313
0.9109 0.9116 0.9127 0.9286 0.9187 0.9199
Columns 7 through 9
0.9303 0.9336 0.9344
0.9282 0.9146 0.9393
0.9359 0.9305 0.9378
0.9182 0.9306 0.9355
0.9212 0.9384 0.9321
0.9334 0.9195 0.9339
0.9200 0.9374 0.9310
0.9239 0.9334 0.9175
0.9305 0.9201 0.9320
0.9274 0.9394 0.9311
R2a =
Columns 1 through 6
0.9107 0.9112 0.9190 0.9146 0.9223 0.9226
0.9107 0.9112 0.9147 0.9204 0.9110 0.9213
0.9107 0.9112 0.9118 0.9175 0.9189 0.9270
0.9107 0.9112 0.9110 0.9221 0.9300 0.9238
0.9107 0.9113 0.9111 0.9192 0.9332 0.9225
0.9107 0.9112 0.9104 0.9114 0.9327 0.9208
0.9107 0.9112 0.9163 0.9167 0.9175 0.9339
0.9107 0.9112 0.9111 0.9126 0.9281 0.9306
0.9107 0.9112 0.9118 0.9119 0.9300 0.9298
0.9107 0.9112 0.9120 0.9277 0.9174 0.9183
Columns 7 through 9
0.9286 0.9317 0.9322
0.9264 0.9121 0.9373
0.9343 0.9285 0.9357
0.9162 0.9286 0.9334
0.9192 0.9366 0.9298
0.9317 0.9172 0.9317
0.9180 0.9356 0.9287
0.9220 0.9315 0.9148
0.9288 0.9178 0.9297
0.9256 0.9377 0.9288
2 Comments
Christian Hofstetter on 25 Aug 2017
Data attached here
Greg Heath on 13 Sep 2017
I clicked on Data.mat but cannot find it on my machine.
Does anyone know where it is hiding?
Greg (a failure at computer nerdism)


Accepted Answer

Greg Heath on 1 Sep 2017
Edited: Greg Heath on 1 Sep 2017
WHOA!!
I just took a look at the 2017a TRAINBR documentation.
Contrary to my previous statements:
YOU CAN USE TRAINBR TO COMBINE MSEREG &
VALSTOPPING!
Read the documentation
help trainbr
and
doc trainbr
because I'm not sure when this was implemented (Maybe it was because of my suggestions to the staff!) (;>)
In addition, you can also combine them using TRAINLM.
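A minimal sketch of the combination (property names are standard Neural Network Toolbox settings; whether TRAINBR actually honors the validation set depends on the MATLAB version, as noted above, so treat this as an assumption to verify in your release):

```matlab
% Sketch: Bayesian regularization + validation stopping on a NARX net.
% Assumes ID, FD are the chosen delay sets and X, T the cell-array data,
% as in the code in the question.
net = narxnet(ID, FD, 10);
net.trainFcn   = 'trainbr';       % Bayesian regularization
net.performFcn = 'msereg';        % regularized performance function
net.divideFcn  = 'divideblock';   % contiguous blocks for timeseries
net.divideParam.trainRatio = 0.8;
net.divideParam.valRatio   = 0.1; % nonzero val set enables valstopping
net.divideParam.testRatio  = 0.1;
net.trainParam.max_fail    = 6;   % validation-stop patience
[Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
[net, tr] = train(net, Xs, Ts, Xi, Ai);
```

If `tr.stop` comes back as 'Validation stop.', the combination is active in your release.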
Hope this helps.
Thank you for formally accepting my answer
Greg
4 Comments
Greg Heath on 2 Sep 2017
So far I have these comments
1. Use UC for cells and lc for doubles.
2. Use subscripts o and c for OL and CL, respectively.
3. Remove semicolons from single-value assignments and print the result as a comment. For example:
[ X, T ] = valve_dataset;
x = cell2mat(X); t = cell2mat(T);
[ I N ] = size(x) % [ 1 1801 ]
[ O N ] = size(t) % [ 1 1801 ]
vart1 = mean(var(ttrn',1)) % 1.4998e+03
vart0 = mean(var(ttrn',0)) % 1.5007e+03
4. In general, relevant sizes are determined from x & t, NOT X & T
5. Determine significant lags using Ntrn, not N
6. Determine Ntrneq from ttrn, not t.
Probably more later.
Greg
Christian Hofstetter on 3 Sep 2017
Edited: Christian Hofstetter on 3 Sep 2017
So, the results in the question are for the valve data set, and the results in the last comment (which you moved from the answer box) belong to my real data set.
I modified the code according to your comments:
clear
plt = 0;
tic
[X,T] = valve_dataset;
x = cell2mat(X);
t = cell2mat(T);
[ I N ] = size(x); % [1 1801]
[ O N ] = size(t); % [1 1801]
MSE00 = mean(var(t',1)) % 1.4998e3
MSE00a = mean(var(t',0)) % 1.5007e3
% Normalization
zx = zscore(x, 1);
zt = zscore(t, 1);
Ntrn = N-2*round(0.1*N); % 1441
trnind = 1:Ntrn;
Ttrn = T(trnind);
Neq = prod(size(Ttrn)); % 1441
%%Determine significant lags
%%{
plt=plt+1,figure(plt)
subplot(211)
plot(t)
title('SIMPLENAR SERIES')
subplot(212)
plot(zt)
title('STANDARDIZED SERIES')
rng('default')
n = randn(1,Ntrn);
L = floor(0.95*(2*Ntrn-1)) % 2736
for i = 1:100
autocorrn = nncorr( n, n, Ntrn-1, 'biased');
sortabsautocorrn = sort(abs(autocorrn));
thresh95(i) = sortabsautocorrn(L);
end
sigthresh95 = mean(thresh95) % 0.0404
autocorrt = nncorr(zt,zt,Ntrn-1,'biased');
siglag95 = -1+ find(abs(autocorrt(Ntrn:2*Ntrn-1))>=sigthresh95);
plt = plt+1, figure(plt)
hold on
plot(0:Ntrn-1, -sigthresh95*ones(1,Ntrn),'b--')
plot(0:Ntrn-1, zeros(1,Ntrn),'k')
plot(0:Ntrn-1, sigthresh95*ones(1,Ntrn),'b--')
plot(0:Ntrn-1, autocorrt(Ntrn:2*Ntrn-1))
plot(siglag95,autocorrt(Ntrn+siglag95),'ro')
title('SIGNIFICANT SIMPLENAR AUTOCORRELATIONS')
%INPUT-TARGET CROSSCORRELATION
%
crosscorrxt = nncorr(zx,zt,Ntrn-1,'biased');
sigilag95 = -1+ find(abs(crosscorrxt(Ntrn:2*Ntrn-1))>=sigthresh95); %significant input lag
%
plt = plt+1, figure(plt)
hold on
plot(0:Ntrn-1, -sigthresh95*ones(1,Ntrn),'b--')
plot(0:Ntrn-1, zeros(1,Ntrn),'k')
plot(0:Ntrn-1, sigthresh95*ones(1,Ntrn),'b--')
plot(0:Ntrn-1, crosscorrxt(Ntrn:2*Ntrn-1))
plot(sigilag95,crosscorrxt(Ntrn+sigilag95),'ro')
title('SIGNIFICANT INPUT-TARGET CROSSCORRELATIONS')
%}
FD = 1:1; %Random Selection of sigflag subset (Feedback Delays) (5 one jump)
ID = 0:2; %Random selection of sigilag subset crosscorrelation (Input Delays) (13 one jump)
NFD = length(FD);
NID = length(ID);
MXFD = max(FD);
MXID = max(ID);
Ntrneq = prod(size(Ttrn)); % 1441
Hub = -1+ceil( (Ntrneq-O) / ((NID*I)+(NFD*O)+1));
Hmax = floor(Hub/10) % 28
Hmax = 10 % Results for the first 10 neurons
Hmin = 0
dh = 1
Ntrials = 10
j = 0;
rng('default')
for h = Hmin:dh:Hmax
fprintf(['_____________H %','d/%d_____________\n'],h,Hmax)
j = j+1
if h == 0
neto = narxnet( ID, FD, [] );
Nw = ( NID*I + NFD*O + 1 )*O
else
neto = narxnet( ID, FD, h );
Nw = ( NID*I + NFD*O + 1 )*h + ( h + 1 )*O
end
Ndof = Ntrn-Nw
[ Xo Xoi Aoi To ] = preparets( neto,X,{},T );
to = cell2mat(To);
xo = cell2mat(Xo);
MSE00o = mean(var(to',1))
MSE00ao = mean(var(to'))
MSEgoal = max( 0,0.001*Ndof*MSE00ao/Neq )
MinGrad = MSEgoal/10
neto.trainFcn = 'trainbr';
neto.performFcn = 'msereg';
neto.trainParam.max_fail = 6;
neto.trainParam.goal = MSEgoal;
neto.trainParam.min_grad = MinGrad;
neto.divideFcn = 'divideblock';
neto.divideParam.trainRatio = 80/100;
neto.divideParam.testRatio = 10/100;
neto.divideParam.valRatio = 10/100;
for i = 1:Ntrials
neto = configure(neto,Xo,To);
[ neto tro Yo Eo Aof Xof ] = train(neto,Xo,To,Xoi,Aoi);
yo = cell2mat(Yo);
stopcrit{i,j} = tro.stop
bestepoch(i,j) = tro.best_epoch
MSE = mse(to-yo)
MSEa = Neq*MSE/Ndof
NMSEo(i,j) = MSE/MSE00o
R2(i,j) = 1-MSE/MSE00o
R2a(i,j) = 1-MSEa/MSE00ao
end
end
stopcrit = stopcrit
bestepoch = bestepoch
R2 = R2
R2a = R2a
Totaltime = toc
%%Close net
[netc Xci Aci] = closeloop(neto,Xoi,Aoi); netc.name = [netc.name ' - Closed Loop'];
view(netc);
[ Xc Xci Aci Tc ] = preparets( netc,X,{},T );
[ Yc Xcf Acf ] = netc(Xc,Xci,Aci);
Ec = gsubtract(Tc,Yc);
yc = cell2mat(Yc);
tc = to;
NMSEc = mse(Ec) /var(tc,1)
R2c = 1-NMSEc
%%Retrain close loop net
% Note that isequal(Tc,To) = 1
[ netc trc Yc Ec Xcf Acf] = train( netc, Xc, Tc, Xci, Aci );
[Yc Xcf Acf ] = netc(Xc, Xci, Aci );
Ec = gsubtract( Tc, Yc );
yc = cell2mat(Yc);
tc = to;
NMSEc = mse(Ec) /var(tc,1)
R2c = 1-NMSEc
figure, plot(yc), hold on, plot(to)
And this is what I get (valve data set) for Hmax = 10 and Ntrials = 10, using trainbr, valstopping, and msereg:
stopcrit =
10×11 cell array
Columns 1 through 7
'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Maximum MU reach…' 'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Maximum MU reach…' 'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Maximum MU reach…' 'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Maximum MU reach…' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
Columns 8 through 11
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
R2 =
0.9191 0.9197 0.9235 0.9238 0.9350 0.9287 0.9459 0.9495 0.9462 0.9461 0.9494
0.9192 0.9197 0.9265 0.9224 0.9380 0.9411 0.9288 0.9461 0.9467 0.9468 0.9417
0.9191 0.9197 0.9215 0.9275 0.9233 0.9286 0.9435 0.9464 0.9454 0.9489 0.9458
0.9191 0.9197 0.9219 0.9234 0.9389 0.9274 0.9453 0.9347 0.9463 0.9394 0.9454
0.9191 0.9198 0.9200 0.9221 0.9298 0.9358 0.9249 0.9454 0.9226 0.9377 0.9371
0.9190 0.9194 0.9291 0.9239 0.9427 0.9231 0.9406 0.9383 0.9329 0.9492 0.9351
0.9192 0.9197 0.9232 0.9283 0.9234 0.9435 0.9221 0.9462 0.9279 0.9363 0.9455
0.9190 0.9197 0.9247 0.9286 0.9335 0.9238 0.9255 0.9487 0.9405 0.9461 0.9368
0.9192 0.9197 0.9215 0.9227 0.9291 0.9174 0.9347 0.9476 0.9416 0.9236 0.9465
0.9191 0.9197 0.9241 0.9218 0.9290 0.9305 0.9255 0.9271 0.9497 0.9463 0.9455
After closing the Net (for H = 10):
R2c =
0.9245
And after retraining netc:
R2c_retrained =
0.9245
And here are the results for trainlm and msereg:
stopcrit =
10×11 cell array
Columns 1 through 7
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Minimum gradient …' 'Minimum gradient …' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
Columns 8 through 11
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
'Validation stop.' 'Validation stop.' 'Validation stop.' 'Validation stop.'
R2 =
0.9192 0.9197 0.9200 0.9292 0.9298 0.9325 0.9291 0.9465 0.9323 0.9451 0.9466
0.9192 0.9197 0.9199 0.9213 0.9252 0.9223 0.9364 0.9427 0.9462 0.9404 0.9493
0.9192 0.9198 0.9198 0.9332 0.9297 0.9264 0.9412 0.9321 0.9468 0.9340 0.9480
0.9192 0.9198 0.9202 0.9295 0.9166 0.9420 0.9309 0.9273 0.9451 0.9312 0.9478
0.9192 0.9198 0.9208 0.9222 0.9285 0.9323 0.9275 0.9319 0.9472 0.9469 0.9357
0.9192 0.9198 0.9350 0.9223 0.9261 0.9342 0.9415 0.9270 0.9358 0.9416 0.9380
0.9192 0.9198 0.9270 0.9270 0.9211 0.9428 0.9428 0.9310 0.9204 0.9364 0.9211
0.9192 0.9198 0.9271 0.9220 0.9329 0.9247 0.9442 0.9259 0.9377 0.9481 0.9400
0.9192 0.9198 0.9219 0.9238 0.9403 0.9431 0.9240 0.9411 0.9353 0.9431 0.9289
0.9192 0.9198 0.9250 0.9205 0.9343 0.9243 0.9267 0.9310 0.9480 0.9476 0.9349
R2c =
0.9230
R2c_retrained =
0.9281


More Answers (2)

Greg Heath on 28 Aug 2017
Validation stopping prevents overtraining an overfit net.
Although the training error is decreasing, the ability of the net to perform satisfactorily on nontraining data (represented by the validation subset) is decreasing.
Alternatives:
1. Choose the best of multiple designs: minimize the number of hidden nodes subject to an upper bound on the training error. For open-loop timeseries I tend to use
MSEgoal = 0.005*mean(var(target',1))
2. Bayesian regularization via TRAINBR.
3. If the new versions of MATLAB allow it, combine TRAINBR and VALSTOPPING.
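Alternative 1 can be sketched as follows (hypothetical variable names mirroring the code in the question; `t` is the target row vector, `ID`/`FD` the chosen delay sets):

```matlab
% Sketch of alternative 1: pick the smallest hidden-layer size h whose
% best-of-Ntrials training error meets the MSE goal.
MSEgoal = 0.005*mean(var(t',1));      % upper bound on training error
for h = Hmin:dh:Hmax
    for i = 1:Ntrials
        net = narxnet(ID, FD, h);
        net.trainParam.goal = MSEgoal;
        [Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
        [net, tr] = train(net, Xs, Ts, Xi, Ai);
        if tr.best_perf <= MSEgoal    % smallest successful h wins
            return
        end
    end
end
```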
Hope this helps.
Thank you for formally accepting my answer
Greg
1 Comment
Christian Hofstetter on 28 Aug 2017
Hi, thanks for answering.
I am using:
MSE00as = mean(var(ts'))
MSEgoal = 0.01*Ndof*MSE00as/Neq
but this goal is never reached.
And when I set
net.trainFcn = 'trainbr';
I reach the min gradient but not my performance goal (I get R2 = 0.04).
I'm using MATLAB R2015a. How can I combine trainbr and valstopping?
Maybe there is too much noise in my data set?



Greg Heath on 3 Sep 2017
Edited: Greg Heath on 13 Sep 2017
1. The ULTIMATE GOAL OF NN TRAINING is that the performance measures of BOTH
   a. Training data
   b. Nontraining data that have the same summary statistics as the training data
are less than a specified upper bound.
2. THEREFORE, it doesn't matter whether the training subset error is minimized or not.
3. HOWEVER, the most common way to achieve the goal in 1 is:
   a. Use a minimization algorithm to reduce the training subset error.
   b. Stop training when either of the following reaches a local minimum:
      i. Training subset error
      ii. Nontraining validation subset error
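The stopping behavior in 3b can be inspected after each run via the training record (a sketch; `tr` is the second output of train, as used in the code in the question):

```matlab
% Sketch: inspect which criterion ended training.
[net, tr] = train(net, Xs, Ts, Xi, Ai);
tr.stop          % e.g. 'Validation stop.' or 'Minimum gradient reached.'
tr.best_epoch    % epoch with the lowest validation error
% train() returns the weights from the best validation epoch,
% so a validation stop is not a failure, just a different stop criterion.
```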
Hope this helps.
Thank you for formally accepting my answer
Greg
