What transfer functions does patternnet use for the hidden and output layers?
Hi, I am trying to learn how a neural network works. I created a network with MATLAB's patternnet to classify XOR. However, when I compute the output manually, I get a different result than net(input). According to this article, if you use the GUI, sigmoid transfer functions are used in both the hidden layer and the output layer (bullet 7); and if you use the command line, tan-sigmoid transfer functions are used in both the hidden and output layers (bullet 2). I tried both versions, and each still gives me a different result. Here is my code:
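For reference, you can query the transfer functions (and the input-processing steps, which also affect the result) directly from the network object rather than relying on an article; the property names below are standard Deep Learning Toolbox fields, but verify them on your release:

```matlab
% Inspect which functions a default patternnet actually uses
net = patternnet(2);                 % untrained net with default settings
disp(net.layers{1}.transferFcn)      % hidden-layer transfer function
disp(net.layers{2}.transferFcn)      % output-layer transfer function
disp(net.inputs{1}.processFcns)      % input pre-processing steps (e.g. mapminmax)
```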
input = [0 0; 0 1; 1 0; 1 1]';
xor = [0 1; 1 0; 1 0; 0 1]'; % note: this shadows MATLAB's built-in xor function
% Create a larger sample size
input10 = repmat(input,1,10);
xor10 = repmat(xor,1,10);
% MatLab NN
net = patternnet(2);
net = train(net, input10, xor10);
% Get the weights
IW = net.IW;
b = net.b;
LW = net.LW;
IW = [IW{1}'; b{1}'];
LW = [LW{2}'; b{2}'];
%% Using tan-sigmoid
% Input to hidden layer
hid = zeros(2,1);
hidsig = zeros(2,1);
in = input(:,1);
for i = 1:2
    hid(i) = dot([in;1],IW(:,i));
    hidsig(i) = tansig(hid(i));
end
% Hidden to output layer without normalization
out = zeros(2,1);
outsig = zeros(2,1);
for i = 1:2
    out(i) = dot([hidsig;1],LW(:,i));
    outsig(i) = tansig(out(i)); % apply the transfer function to the layer's net input, not to hidsig
end
outsoftmax = softmax(out);
outsoftmaxsig = softmax(outsig);
% Hidden to output layer with normalization
normout = zeros(2,1);
normoutsig = zeros(2,1);
normhidsig = hidsig./norm(hidsig);
for i = 1:2
    normout(i) = dot([normhidsig;1],LW(:,i));
    normoutsig(i) = tansig(normout(i)); % transfer function applied to the net input
end
normoutsoftmax = softmax(normout);
normoutsoftmaxsig = softmax(normoutsig);
result = net(in);
disp(result);
disp('tan-sigmoid');
disp(outsig);
disp(outsoftmax);
disp(outsoftmaxsig);
disp(normoutsig);
disp(normoutsoftmax);
disp(normoutsoftmaxsig);
%% Using sigmoid
% Input to hidden layer
hid = zeros(2,1);
hidsig = zeros(2,1);
in = input(:,1);
for i = 1:2
    hid(i) = dot([in;1],IW(:,i));
    hidsig(i) = sigmf(hid(i),[1,0]);
end
% Hidden to output layer without normalization
out = zeros(2,1);
outsig = zeros(2,1);
for i = 1:2
    out(i) = dot([hidsig;1],LW(:,i));
    outsig(i) = sigmf(out(i),[1,0]); % apply the transfer function to the layer's net input, not to hidsig
end
outsoftmax = softmax(out);
outsoftmaxsig = softmax(outsig);
% Hidden to output layer with normalization
normout = zeros(2,1);
normoutsig = zeros(2,1);
normhidsig = hidsig./norm(hidsig);
for i = 1:2
    normout(i) = dot([normhidsig;1],LW(:,i));
    normoutsig(i) = sigmf(normout(i),[1,0]); % transfer function applied to the net input
end
normoutsoftmax = softmax(normout);
normoutsoftmaxsig = softmax(normoutsig);
result = net(in);
disp('sigmoid');
disp(outsig);
disp(outsoftmax);
disp(outsoftmaxsig);
disp(normoutsig);
disp(normoutsoftmax);
disp(normoutsoftmaxsig);
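One hedged sketch of reproducing net(in) by hand: by default, patternnet also applies input and output processing functions (removeconstantrows and mapminmax) around the layer computations, and skipping that processing, rather than the choice of transfer function, is a common source of this kind of mismatch. The code below uses the network's own stored function names; the property names are standard Deep Learning Toolbox fields, but check them on your release:

```matlab
% Manual forward pass using the functions stored in the net object (sketch)
xp = in;
for k = 1:numel(net.inputs{1}.processFcns)
    % each process function supports the ('apply', x, settings) syntax
    xp = feval(net.inputs{1}.processFcns{k}, 'apply', xp, ...
               net.inputs{1}.processSettings{k});
end
h = feval(net.layers{1}.transferFcn, net.IW{1,1}*xp + net.b{1});
y = feval(net.layers{2}.transferFcn, net.LW{2,1}*h  + net.b{2});
for k = numel(net.outputs{2}.processFcns):-1:1
    % undo the target processing so y is on the same scale as net(in)
    y = feval(net.outputs{2}.processFcns{k}, 'reverse', y, ...
              net.outputs{2}.processSettings{k});
end
disp([y, net(in)])   % the two columns should agree
```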