mean squared logarithmic error regression layer
7 views (last 30 days)
VICTOR CATALA
on 14 Jun 2019
Commented: Erdem AKAGUNDUZ
on 16 Mar 2020
I'm trying to write an MSLE regression layer.
Here is my code:
"
classdef msleRegressionLayer < nnet.layer.RegressionLayer
    % Custom regression layer with mean-squared-logarithmic-error loss.

    methods
        function layer = msleRegressionLayer(name)
            % layer = msleRegressionLayer(name) creates a
            % mean-squared-logarithmic-error regression layer and
            % specifies the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'Mean squared logarithmic error';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the MSLE loss
            % between the predictions Y and the training targets T.

            % Calculate MSLE over the R responses. Note the element-wise
            % division ./ (the original used the matrix division /).
            R = size(Y,1);
            msle = sum((log10((Y+1)./(T+1))).^2, 1)/R;

            % Take mean over mini-batch.
            N = size(Y,2);
            loss = sum(msle,2)/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % Returns the derivative of the MSLE loss with respect to the
            % predictions Y. Since d/dY log10(Y+1) = 1/((Y+1)*log(10)),
            % the log(10) factor divides rather than multiplies, and the
            % natural log is log in MATLAB (there is no ln function).
            R = size(Y,1);
            N = size(Y,2);
            dLdY = 2*(log10(Y+1) - log10(T+1)) ./ ((Y+1)*N*R*log(10));
        end
    end
end
"
In this case, size of x_train is 1024 x 500000 and size of Y_train is 1 x 500000.
Any help is welcome.
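For context, here is a minimal sketch (not part of the original post) of how such a custom layer might be wired into a network for data shaped like the x_train / Y_train described above. The hidden-layer width, training options, and use of featureInputLayer (R2020b or later; earlier releases would need the vectors reshaped for an imageInputLayer) are assumptions for illustration only.
% Illustrative sketch: plugging the custom MSLE layer into a simple
% feed-forward regression network. Layer sizes and options are arbitrary.
layers = [
    featureInputLayer(1024)
    fullyConnectedLayer(128)
    reluLayer
    fullyConnectedLayer(1)
    msleRegressionLayer('msle')];

options = trainingOptions('adam', ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 256, ...
    'Verbose', true);

% For feature input, trainNetwork expects observations in rows,
% so the 1024 x 500000 and 1 x 500000 matrices are transposed here.
net = trainNetwork(x_train', Y_train', layers, options);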
2 Comments
Accepted Answer
VICTOR CATALA
on 27 Jun 2019
3 Comments
Erdem AKAGUNDUZ
il 16 Mar 2020
Hello Victor,
Nice job with the MSLE loss layer, and thanks.
I have a question actually, and I hope you can help me.
Why do we divide by the mini-batch size (N = size(Y,4)) in the backwardLoss function?
I know the examples in the MATLAB help also do this, but I don't understand it, so I am looking for an answer.
For example:
If the output of the network (that goes into the loss function) is 224x224x1xN,
then we expect the size of dLdY to be the same, 224x224x1xN.
So why do we divide this gradient by N? We did NOT sum over the gradients in the mini-batch dimension, so why average along that dimension?
Thank you very much.
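One way to see why the 1/N factor belongs in backwardLoss: forwardLoss already averages the per-observation losses over the mini-batch, so backwardLoss returns the derivative of that single averaged scalar with respect to each element of Y, and every element of dLdY inherits the 1/N even though dLdY keeps the full size of Y. A small finite-difference check against the layer above can illustrate this; the sizes below are arbitrary assumptions purely for the check, not from the thread.
% Rough numerical check: compare backwardLoss against central finite
% differences of forwardLoss, element by element.
layer = msleRegressionLayer('msle');
Y = rand(3, 5);          % R = 3 responses, N = 5 observations
T = rand(3, 5);
analytic = layer.backwardLoss(Y, T);

eps0 = 1e-6;
numeric = zeros(size(Y));
for k = 1:numel(Y)
    Yp = Y; Yp(k) = Yp(k) + eps0;
    Ym = Y; Ym(k) = Ym(k) - eps0;
    numeric(k) = (layer.forwardLoss(Yp, T) - layer.forwardLoss(Ym, T)) / (2*eps0);
end

% Difference should be close to zero if backwardLoss (including its
% 1/N factor) is consistent with the averaging done in forwardLoss.
max(abs(analytic(:) - numeric(:)))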
More Answers (0)