The weights and biases returned by autoencoder training don't give the same encoded values when the encoding is calculated manually.

clc
clear all
close all
%% Generate m-by-k cosine-wave training data with noise
n = [0:99]';
b = 0.15*pi;
w = b*ones(1,1000);
S = cos(w.*n);
[m,k] = size(S);
a = 0.375; % noise variance
X = S+sqrt(a)*randn(m,k);
%% Define the size of the hidden layer (number of hidden neurons)
hiddenSize = 2;
%% Build the autoencoder
autoenc = trainAutoencoder(X, hiddenSize, ...
    'MaxEpochs', 1000, ...
    'EncoderTransferFunction', 'satlin', ...
    'DecoderTransferFunction', 'purelin', ...
    'L2WeightRegularization', 0.01, ...
    'SparsityRegularization', 4, ...
    'SparsityProportion', 0.10);
%% View AE structure
view(autoenc)
%% Generate 10 test signals and plot each against the reconstructed and original signals
for i = 1:10
    X_test = cos(b*n) + sqrt(a)*randn(m,1);      % noisy test signal
    xReconstructed = predict(autoenc, X_test);   % autoencoder reconstruction
    subplot(2,1,1), plot(n, X_test, n, xReconstructed, 'LineWidth', 2)
    grid on
    legend('Noisy','Reconstructed')
    subplot(2,1,2), plot(n, cos(b*n), n, xReconstructed, 'LineWidth', 2)
    grid on
    legend('Original','Reconstructed')
    pause
end
WW = autoenc.EncoderWeights; % WW are the weights after autoencoder training
BB = autoenc.EncoderBiases; % BB are the biases after autoencoder training
X_test = cos(b*n); % clean (noise-free) test signal
encoded_result = encode(autoenc,X_test)
encoded_result =
    0.0956
    0.0859
These are the values of the two hidden neurons returned by the encode function.
Then we use the equation implemented by encode, which is

z = h^(1)( W^(1)*x + b^(1) )

Our code is

encoded_manually = (WW * X_test) + BB;

and we then apply the sigmoid function element-wise:

for i3 = 1:hiddenSize
    logsig(i3,1) = 1 / (1 + exp(-encoded_manually(i3,1))); % note: this variable shadows MATLAB's built-in logsig function
end
logsig
Finally, our results are:

logsig =
    0.6631
    0.4902
So there is a mismatch between the result returned by the encode function and the result we calculate manually using the equation given in the documentation for encode.
Can anyone help us find the mistake in our code, so that the manual calculation gives the same result as the encode function?

Answers (1)

Anshika Chaurasia on 8 Oct 2021
Edited: Anshika Chaurasia on 8 Oct 2021
Hi,
The reason for the discrepancy is that the inputs are preprocessed before encoding: they are transformed with the 'mapminmax' function.
You can also notice this by converting the autoencoder to a network object:
net = network(autoenc);
net.inputs{1}.processFcns
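For reference, here is a minimal sketch of what that preprocessing does, assuming 'mapminmax' is the only processing function listed (it rescales each input row to [-1, 1] using the settings stored at training time):
settings = net.inputs{1}.processSettings{1};      % stored mapminmax settings (assumes a single processing function)
X_scaled = mapminmax('apply', X_test, settings);  % same transform applied to the inputs before encoding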
Try the following snippet to calculate the encoding manually:
net = network(autoenc);
% preprocess the test input with the network's input processing functions
inputs = X_test;
for iii = 1:numel(net.inputs{1}.processFcns)
    inputs = feval( net.inputs{1}.processFcns{iii}, ...
        'apply', inputs, net.inputs{1}.processSettings{iii} );
end
% manually encode: satlin is the encoder transfer function chosen in trainAutoencoder
encoded_manually = satlin((WW * inputs) + BB)
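As a quick follow-up check (a sketch, assuming the snippet above has been run in the same workspace), you can compare the manual result directly against the output of encode. Note that the transfer function here is satlin, as set via 'EncoderTransferFunction' in trainAutoencoder, not the logsig used in the question's manual calculation:
% the difference between encode's output and the manual encoding should be ~0
encoded_result = encode(autoenc, X_test);
max(abs(encoded_result - encoded_manually))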
