Why are the results of my Elman network different every time?
Heather Zhang on 31 Jul 2015
Commented: Greg Heath on 13 Aug 2015
I created an Elman network, but the results are different every time I run the code: I get different "errors", "regression", and "avg_error" values. Could anyone tell me why? Appreciate it so much!
Here is the code.
clear all
load('input4_train.mat');
load('output4_train.mat');
load('input4_test.mat');
load('output4_test.mat');
inputSeries = tonndata(input4_train,false,false);
targetSeries = tonndata(output4_train,false,false);
inputTest = tonndata(input4_test,false,false);
outputTest = tonndata(output4_test,false,false);
% Create a Network
hiddenLayerSize = 5;
net=newelm(inputSeries,targetSeries,[10,3,1], {'tansig','logsig','purelin'});
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
net.trainParam.epochs = 2000;
% Initial net
net = init(net);
% Train the Network
net = adapt(net,inputSeries,targetSeries);
% Test the Network
outputs = sim(net,inputTest);
errors = gsubtract(outputTest,outputs);
error = cell2mat(errors);
for i = 1:10
error(i)=abs(error(i));
end
avg_error = sum(error)/10;
performance = perform(net,outputTest,outputs)
% View the Network
view(net)
% Plots
figure, plotregression(outputTest,outputs)
figure, plotresponse(outputTest,outputs)
figure, ploterrcorr(errors)
3 Comments
Greg Heath on 1 Aug 2015
Edited: Walter Roberson on 2 Aug 2015
% load('input4_train.mat');
% load('output4_train.mat');
% load('input4_test.mat');
% load('output4_test.mat');
%
% inputSeries = tonndata(input4_train,false,false);
% targetSeries = tonndata(output4_train,false,false);
% inputTest = tonndata(input4_test,false,false);
% outputTest = tonndata(output4_test,false,false);
whos
% % Create a Network
% hiddenLayerSize = 5;
Value never used
% net=newelm(inputSeries,targetSeries,[10,3,1], {'tansig','logsig','purelin'});
No justification for 3 hidden layers. One is sufficient.
% % Setup Division of Data for Training, Validation, Testing
% net.divideParam.trainRatio = 70/100;
% net.divideParam.valRatio = 15/100;
% net.divideParam.testRatio = 15/100;
The above 3 commands are unnecessary; those are the default values.
% net.trainParam.epochs = 2000;
%
% % Initial net
% net = init(net);
%
% % Train the Network
% net = adapt(net,inputSeries,targetSeries);
%
% % Test the Network
% outputs = sim(net,inputTest);
%
% errors = gsubtract(outputTest,outputs);
% error = cell2mat(errors);
% for i = 1:10
% error(i)=abs(error(i));
% end
% avg_error = sum(error)/10;
The above 5 commands are unnecessary.
help mae
doc mae
% performance = perform(net,outputTest,outputs)
% % View the Network
% view(net)
%
% % Plots
% figure, plotregression(outputTest,outputs)
% figure, plotresponse(outputTest,outputs)
% figure, ploterrcorr(errors)
The above 3 commands are unnecessary; those are the defaults.
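[Editor's note] Greg's "help mae" / "doc mae" pointer refers to the toolbox's mean-absolute-error function. A minimal sketch of what he seems to be suggesting, assuming errors is the cell array produced by gsubtract in the question:
% Sketch only: mae (Neural Network Toolbox) replaces the manual abs/sum loop
avg_error = mae(cell2mat(errors));   % mean of the absolute errors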
Accepted Answer
Walter Roberson on 31 Jul 2015
Neural networks usually initialize their weights randomly. If you want repeatability, you can initialize the weights yourself or set the random number generator seed.
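[Editor's note] A minimal sketch of the seeding approach (not from the original thread): fix MATLAB's global random number generator with rng before the weights are initialized, so every run starts from the same initial weights. Variable names follow the code in the question.
rng(0)                                      % fix the seed so initialization is repeatable
net = init(net);                            % now produces the same initial weights every run
net = adapt(net,inputSeries,targetSeries);  % training therefore starts from identical weights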