
Classification problem in neural network code

Odrisso on 9 Jul 2016
Commented: Odrisso on 11 Jul 2016
I have developed code for a backpropagation ANN to classify snore segments. I have 10 input features, 1 hidden layer with 10 neurons, and one output neuron. I label no-snore segments as 1 and snore segments as 0. I have 3000 segments in total: 2500 are no-snore segments (marked 1) and 500 are snore segments (marked 0). I have already divided the data set into three sets (70% training, 15% validation, 15% testing), roughly as in the sketch below, and I also use bias terms.
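For context, the shuffle and split are done roughly like this (a simplified sketch; features and labels are placeholder names for the full 3000-by-10 feature matrix and 3000-by-1 label vector, which are not shown here):
idx = randperm(size(features,1));            % shuffle so snore and no-snore segments are mixed
features = features(idx,:);
labels = labels(idx);
nTrain = round(0.70*numel(labels));          % 70% training
nVal = round(0.15*numel(labels));            % 15% validation, remaining 15% testing
xshuffled = features(1:nTrain,:);
yshuffled = labels(1:nTrain);
xvalid = features(nTrain+1:nTrain+nVal,:);
yvalid = labels(nTrain+1:nTrain+nVal);
xtest = features(nTrain+nVal+1:end,:);
ytest = labels(nTrain+nVal+1:end);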
While training the network, I first shuffled the training set so that the snore and no-snore segments are mixed together. After training, when I validate the network (feed-forward only), I find that it can only classify one of the two classes. To be more precise: suppose the last element of the training set is a no-snore segment (label 1). The network ends up trained for that last output, so in the validation phase it always gives an output close to 1, even for snore segments (label 0). The same thing happens if the last element is a snore segment (0): the network then gives an output close to 0 all the time during validation. The problem seems to be that the network memorizes the weights for one label (say 0), and when the other label (say 1) comes along, it forgets the weights learned for the previous elements. I have tried several hidden layers and 2 output neurons, but the problem remains the same.
Here is the code:
x = xshuffled;                        % shuffled training features
y = yshuffled;                        % shuffled training labels (0 = snore, 1 = no snore)
m = length(y);                        % number of training segments
numFeatures = size(x,2);
alpha = 1;                            % learning rate
g = @(z) 1.0 ./ (1.0 + exp(-z));      % sigmoid activation
Maxitr = 50;                          % number of iterations
numofhiddenlayer = 1;
numofhiddenneuroninhl1 = 10;
numofoutputneuron = 1;
% Initialize the weights
hll1 = numofhiddenneuroninhl1;
V  = rand(numFeatures, hll1);         % input  -> hidden weights
VB = rand(hll1, 1);                   % hidden bias weights
W  = rand(hll1, numofoutputneuron);   % hidden -> output weights
WB = rand(numofoutputneuron, 1);      % output bias weight
% Training: for each segment, run Maxitr weight-update iterations on that segment alone
for ii = 1:m
    [pp,qq] = size(V);
    for iii = 1:Maxitr
        % Forward propagation
        % Layer 1
        for jk = 1:numofhiddenneuroninhl1
            z(jk) = x(ii,:)*V(:,jk) + VB(jk);
            a(jk) = g(z(jk));
        end
        % Layer 2
        zout = a*W + WB;
        aout(ii) = g(zout);
        % Back propagation
        errorValue(ii,iii) = (1/2)*((y(ii) - aout(ii)).^2);
        d1 = (aout(ii) - y(ii))*aout(ii)*(1 - aout(ii));
        W1 = W';
        % Update of layer 1 weights
        for jj = 1:pp
            for jjj = 1:qq
                V(jj,jjj) = V(jj,jjj) - alpha*d1*W1(1,jjj)*a(jjj)*(1 - a(jjj))*x(ii,jj);
            end
        end
        % Update of layer 1 bias weights
        for ij = 1:length(VB)
            VB(ij) = VB(ij) - alpha*d1*W1(1,ij)*a(ij)*(1 - a(ij));
        end
        % Update of layer 2 weights
        for j = 1:length(W)
            W(j) = W(j) - alpha*d1*a(j);
        end
        % Update of layer 2 bias weight
        WB = WB - alpha*d1;
    end
end
%% Validation (feed-forward only)
xval = xtest;
yval = ytest;
m1 = length(yval);
Vval = V;
VBval = VB;
Wval = W;
WBval = WB;
for ii = 1:m1
    % Layer 1
    for jk = 1:numofhiddenneuroninhl1
        zval(jk) = xval(ii,:)*Vval(:,jk) + VBval(jk);
        aval(jk) = g(zval(jk));
    end
    % Layer 2 (single output neuron)
    zoutval = aval*Wval + WBval;
    aoutval(ii) = g(zoutval);
end
Yscore = aoutval';
Ypred = round(Yscore);
%% Model accuracy
corrc = sum(Ypred == ytest)/length(ytest)*100;
fprintf('Prediction Accuracy: %0.2f%%\n', corrc);
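To see the problem more clearly, I also check the accuracy per class, since 2500 of the 3000 segments are no-snore and the overall rate alone can hide the failure on the snore class:
% Per-class accuracy: shows whether one class is classified while the other is not
accNoSnore = sum(Ypred == 1 & ytest == 1)/sum(ytest == 1)*100;
accSnore   = sum(Ypred == 0 & ytest == 0)/sum(ytest == 0)*100;
fprintf('No-snore accuracy: %0.2f%%, snore accuracy: %0.2f%%\n', accNoSnore, accSnore);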
How can I solve this problem? Why can't my network remember what it learned for the previous segments? It only retains the weights for the last segment. What should I change in the network to solve this?
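In case it helps to show what I mean: should the loop order be the other way around, with an outer loop over passes through the whole training set and an inner loop over the shuffled segments? A rough sketch of that order, using the same variable names as above (numEpochs is just a placeholder for Maxitr, and the updates are written in vectorized form):
numEpochs = 50;                          % placeholder for Maxitr
for epoch = 1:numEpochs
    order = randperm(m);                 % reshuffle the training set each pass
    for ii = order
        % Forward propagation for segment ii
        a    = g(x(ii,:)*V + VB');       % 1-by-10 hidden activations
        aout = g(a*W + WB);              % scalar output
        % Back propagation: one weight update for this segment, then move on
        d1 = (aout - y(ii))*aout*(1 - aout);
        V  = V  - alpha*d1*(x(ii,:)')*(W'.*a.*(1 - a));
        VB = VB - alpha*d1*(W.*(a.*(1 - a))');
        W  = W  - alpha*d1*a';
        WB = WB - alpha*d1;
    end
end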
  4 Comments
Greg Heath on 11 Jul 2016
Sorry, I am having trouble with Windows 10. Try again with *.txt or *.m.
Greg
Odrisso on 11 Jul 2016
Hi Greg, please check the attached .m files.


Answers (0)
