Can somebody explain to me how to use "divideind"?

I am using the Neural Network Toolbox for pattern recognition. The toolbox picks random columns of the input matrix for training, validation and testing at the defined percentages, which results in a different performance graph every time I train it.
I read that if I generate the advanced script and use divideind I can fix the validation, testing and training matrices, but I am not sure how to use it or what amendments should be made in the advanced script. Kindly help.
P.S. Don't tell me to read help and doc; it is useless (at least for me).

Accepted Answer

Greg Heath
Greg Heath on 17 Nov 2013
Edited: Greg Heath on 17 Nov 2013
The first time, use as many defaults as possible. Defaults and basic code examples are listed in
help patternnet and
doc patternnet
Also, if you remove the semicolon, all the defaults will be revealed via
net = patternnet(hiddenLayerSize)
Once you are error free, start to make changes.
Even the correct code may not work because of an unfortunate set of initial weights. Therefore, with the correct number of hidden nodes I usually design 10 nets in a loop. The best is chosen by the lowest validation set error. The prediction of performance on unseen data (generalization) is obtained from the corresponding test set error.
Often the default number of hidden nodes (10) is not optimal. My solution is a double-loop design with the outer loop over ~10 candidate values, as sketched below.
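For illustration, a minimal sketch of that multi-trial design (not code from the original post; the candidate range 2:2:20, the 10 trials and simpleclass_dataset are only illustrative choices):
[x, t] = simpleclass_dataset;
rng(0)                               % repeatable data divisions and weight initializations
Hcand   = 2:2:20;                    % ~10 candidate numbers of hidden nodes
Ntrials = 10;                        % nets designed per candidate
bestVal = Inf;
for H = Hcand
    for k = 1:Ntrials
        net = patternnet(H);
        [net, tr] = train(net, x, t);
        if tr.best_vperf < bestVal   % keep the net with the lowest validation set error
            bestVal = tr.best_vperf;
            bestNet = net;
            bestTst = tr.best_tperf; % estimate of performance on unseen data
        end
    end
end
bestVal, bestTst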
Details are in my posts obtained from searching
greg patternnet Ntrials
Hope this helps.
Thank you for formally accepting my answer
Greg
P.S. More later.

More Answers (5)

Greg Heath
Greg Heath on 17 Nov 2013
[ inputs, targets ] = simpleclass_dataset;
[ I N ] = size(inputs) % [ 2 1000 ]
[ O N ] = size(targets) % [ 4 1000 ]
hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);
view(net)
net.divideFcn = 'divideind';
net.divideParam.trainInd = 151:850;
net.divideParam.valInd = 1:150;
net.divideParam.testInd = 851:1000;
[net,tr] = train(net,inputs,targets);
view(net)
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
  3 Comments
Greg Heath
Greg Heath on 21 Nov 2013
1. It depends upon the application. For classification or pattern-recognition, VEC2IND is the most common.
2. NaNs are ignored.
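For example (simpleclass_dataset is used here only for illustration):
[~, t] = simpleclass_dataset;   % t is 4 x 1000, one column of 0s and a single 1 per sample
trueclassind = vec2ind(t);      % 1 x 1000 row of class labels in 1..4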
Greg Heath
Greg Heath on 28 Jan 2014
I forgot to apply the mask to the outputs when calculating trn/val/tst performance!
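One reading of that correction (a sketch, not code posted in this thread) is to mask the outputs with the same tr masks used for the targets in the answer above before calling perform:
trainOutputs = outputs .* tr.trainMask{1};
valOutputs = outputs .* tr.valMask{1};
testOutputs = outputs .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,trainOutputs)
valPerformance = perform(net,valTargets,valOutputs)
testPerformance = perform(net,testTargets,testOutputs)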



Greg Heath
Greg Heath on 28 Jan 2014
Your original problem of nonrepeatability is easily solved by initializing the RNG before it is used to divide data or initialize weights. If you search using
greg Ntrials
you will see the command
rng(0)
However, you can use any positive integer, e.g., rng(4151941). Keeping random division with a fixed seed tends to be preferable to divideind because random division eliminates any bias in the way the data was collected.
However, I will find one of my examples that uses divideind and post the URL.
Greg
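A minimal sketch of that idea (the seed value is arbitrary; any fixed nonnegative integer gives the same repeatability):
[x, t] = simpleclass_dataset;
rng(0)                        % fix the RNG state before data division and weight initialization
net = patternnet(10);         % the default dividerand split is kept
[net, tr] = train(net, x, t);
% rerunning this block from rng(0) reproduces the same division, initial weights and trained net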

Greg Heath
Greg Heath on 28 Jan 2014
close all, clear all, clc
[ x, t ] = simpleclass_dataset;
[ I N ] = size(x) % [ 2 1000]
[ O N ] = size(t) % [ 4 1000]
trueclassind = vec2ind(t);
ind1 = find(trueclassind == 1);
ind2 = find(trueclassind == 2);
ind3 = find(trueclassind == 3);
ind4 = find(trueclassind == 4);
N1 = length(ind1) % 243
N2 = length(ind2) % 247
N3 = length(ind3) % 233
N4 = length(ind4) % 277
minmax1 = minmax(ind1) % [ 5 993 ]
minmax2 = minmax(ind2) % [ 1 1000 ]
minmax3 = minmax(ind3) % [ 4 996 ]
minmax4 = minmax(ind4) % [ 6 985 ]
mean(diff(trueclassind)) % 0 Classes completely mixed up
trnind = 1:700;
valind = 701:850;
tstind = 851:1000;
Ntrn = 700
Nval = 150
Ntst = 150
Ntrneq = Ntrn*O
MSEtrn00 = mean(var(t(:,trnind)',1)) % 0.1875
MSEtrn00a = mean(var(t(:,trnind)',0)) % 0.1878
MSEval00 = mean(var(t(:,valind)',1)) % 0.1892
MSEtst00 = mean(var(t(:,tstind)',1)) % 0.1858
% Create a Pattern Recognition Network
H = 10;
net = patternnet(H);
Nw = (I+1)*H+(H+1)*O % 74
Ndof = Ntrneq-Nw % 2726
net.divideFcn = 'divideind';
net.divideParam.trainInd = trnind;
net.divideParam.valInd = valind;
net.divideParam.testInd = tstind;
[net, tr, y, e] = train(net,x,t); % e = t-y
% Test the Network
MSEtrn = mse(e(:,trnind)) % 1.5629e-7
MSEtrna = Ntrneq*MSEtrn/Ndof % 1.6053e-7
R2trn = 1-MSEtrn/MSEtrn00 % 1
R2trna = 1-MSEtrna/MSEtrn00a % 1
R2val = 1-mse(e(:,valind))/MSEval00 % 1
R2tst = 1-mse(e(:,tstind))/MSEtst00 % 1

Greg Heath
Greg Heath on 11 Nov 2013
Try something and post it. If it is wrong, maybe someone can help.

Mehrukh Kamal
Mehrukh Kamal on 17 Nov 2013
% Solve a Pattern Recognition Problem with a Neural Network
% Script generated by NPRTOOL
% Created Sun Nov 17 15:22:59 PKT 2013
%
% This script assumes these variables are defined:
%
% ho - input data.
% Target - target data.
inputs = ho;
targets = Target;
% Create a Pattern Recognition Network
hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand'; % Divide data randomly  *I know I have to make amendments here, e.g. dividerand will become divideind*
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 70/100; % *this will become trainInd = ??? I don't know what to write here*
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% For help on training function 'trainscg' type: help trainscg
% For a list of all training functions type: help nntrain
net.trainFcn = 'trainscg'; % Scaled conjugate gradient
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotregression', 'plotfit'};
% Train the Network
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotconfusion(targets,outputs)
%figure, plotroc(targets,outputs)
%figure, ploterrhist(errors)
And what else do I have to change?
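For reference, a possible amendment of the data-division block in such a generated script (the contiguous 70/15/15 split below is only an example; choose whatever indices suit your data, with N the number of sample columns):
N = size(inputs, 2);                          % number of samples (columns)
net.divideFcn = 'divideind';                  % fixed, index-based division
net.divideMode = 'sample';                    % divide up every sample
net.divideParam.trainInd = 1 : round(0.70*N);
net.divideParam.valInd = round(0.70*N)+1 : round(0.85*N);
net.divideParam.testInd = round(0.85*N)+1 : N;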
  4 Comments
Greg Heath
Greg Heath on 28 Jan 2014
1. Please remove all statements that are covered by defaults
2. Test on one of MATLAB's example data sets for classification/pattern-recognition
help nndatasets
doc nndatasets
Their example for patternnet is the iris_dataset. However, that is a multidimensional input set. Try one of the single-dimensional sets.
Greg Heath
Greg Heath on 28 Jan 2014
Sorry, there are no single-dimensional input examples. Just use
simpleclass_dataset

