Deep Learning Custom Layer learning parameters update

Hello,
I am working on a deep learning project in which I try to classify data from a CSV file. I tried to use a custom layer, but when I train the network the loss function seems constant, as if the weights are not being updated.
Do you know what could be the reason for this behavior?
I am sure of my dataset, because when I use a fullyConnectedLayer instead of my custom layer the training works perfectly and testing gives me 100% accuracy.
I also include the predict and backward functions from my custom layer, where Weights is a learnable parameter:
function Z = predict(layer, X)
    % Z = predict(layer, X) forwards the input data X through the
    % layer and returns the result Z.
    W = layer.Weights;
    numObs = size(X,2);  % number of observations (avoids shadowing the built-in "numel")

    % Initialize output
    Z = zeros(layer.OutputSize, numObs, "single");

    % Weighted sum over the input features
    for k = 1:numObs
        for j = 1:layer.OutputSize
            for i = 1:layer.InputSize
                Z(j,k) = Z(j,k) + W(j,i)*X(i,k);
            end
        end
    end
end
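For reference, the triple loop above is just the matrix product Z = W*X. A minimal NumPy sketch (with made-up sizes, not from the original post) confirms that the loop and the vectorized form agree:

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, output_size, num_obs = 4, 3, 5
W = rng.standard_normal((output_size, input_size))
X = rng.standard_normal((input_size, num_obs))

# Loop version, mirroring the predict function above
Z_loop = np.zeros((output_size, num_obs))
for k in range(num_obs):
    for j in range(output_size):
        for i in range(input_size):
            Z_loop[j, k] += W[j, i] * X[i, k]

# Vectorized equivalent: a single matrix product
Z_vec = W @ X

print(np.allclose(Z_loop, Z_vec))  # True
```

In MATLAB the same replacement (Z = W*X) would make predict both shorter and much faster than the nested loops.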
function [dLdX, dLdWeight] = backward(layer, X, ~, dLdZ, ~)
    % Initialization
    W = layer.Weights;
    dLdWeight = zeros(size(W), "single");
    dLdX = zeros(size(X), "single");

    % Backward operation: accumulate gradients over observations
    for k = 1:size(X,2)
        for j = 1:layer.OutputSize
            for i = 1:layer.InputSize
                dLdWeight(j,i) = dLdWeight(j,i) + X(i,k)*dLdZ(j,k);
                dLdX(i,k)      = dLdX(i,k)      + W(j,i)*dLdZ(j,k);
            end
        end
    end
end
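The backward loops likewise implement the standard gradients of Z = W*X, namely dLdWeight = dLdZ*X' and dLdX = W'*dLdZ. A NumPy sketch (again with arbitrary sizes) checks that the looped accumulation matches those closed forms, which is a quick way to rule out a gradient bug in the layer:

```python
import numpy as np

rng = np.random.default_rng(1)
input_size, output_size, num_obs = 4, 3, 5
W = rng.standard_normal((output_size, input_size))
X = rng.standard_normal((input_size, num_obs))
dLdZ = rng.standard_normal((output_size, num_obs))

# Loop version, mirroring the backward function above
dLdW_loop = np.zeros_like(W)
dLdX_loop = np.zeros_like(X)
for k in range(num_obs):
    for j in range(output_size):
        for i in range(input_size):
            dLdW_loop[j, i] += X[i, k] * dLdZ[j, k]
            dLdX_loop[i, k] += W[j, i] * dLdZ[j, k]

# Vectorized equivalents of the same gradients
dLdW_vec = dLdZ @ X.T
dLdX_vec = W.T @ dLdZ

print(np.allclose(dLdW_loop, dLdW_vec) and np.allclose(dLdX_loop, dLdX_vec))  # True
```

Since the math checks out, the constant loss is more likely a training-setup issue (e.g. initialization or learning rate) than an error in these two functions.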
Thank you in advance for your future help.
Mathieu

Accepted Answer

yanqi liu on 13 Jan 2022
Yes, sir. Maybe add a dropoutLayer or batchNormalizationLayer to the model.
1 Comment
Mathieu Chêne on 14 Jan 2022
Thank you for your answer.
I tried it with a dropoutLayer and it seems to work: my accuracy increases and my loss decreases.
Thank you
Mathieu


More Answers (0)
