clamp cross-entropy loss
3 views (last 30 days)
Matt Fetterman
on 3 Sep 2020
Commented: Matt Fetterman
on 6 Sep 2020
The MATLAB cross-entropy loss has this form:
loss = -sum(W*(T.*log(Y)))/N;
I would like to "clamp" it so that the output of the log function is bounded; for example, so that it cannot be less than 100.
Can we do it?
0 Comments
Accepted Answer
David Goodmanson
on 3 Sep 2020
Edited: David Goodmanson
on 6 Sep 2020
Hi Matt,
z = log(Y);
z(z<100) = 100;
loss = -sum(W*(T.*z))/N;
In the link you provided, they talk about a limit of -100 rather than +100; the former appears to make more sense. There are lots of possibilities for a smooth, differentiable cutoff. Here is one, assuming Y >= 0:
Ylimit = -100;
loss = -sum(W*(T.*log(Y+exp(Ylimit))))/N;
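For illustration, here is a small self-contained sketch comparing the hard clamp with the smooth cutoff above. The values of N, W, T, and Y are made-up placeholders (their shapes are assumed from the loss formula in the question); the point is that both variants stay finite when a prediction is essentially zero, while the smooth version remains differentiable:

N = 4;
W = 1;                         % class weight (scalar here for simplicity)
T = [1 0 0 1];                 % example targets (placeholder values)
Y = [0.9 0.1 1e-60 0.8];       % example predictions, one of them near zero
Ylimit = -100;

% Hard clamp: bound log(Y) from below at Ylimit
z = log(Y);
z(z < Ylimit) = Ylimit;
lossHard = -sum(W*(T.*z))/N;

% Smooth cutoff: log(Y + exp(Ylimit)) >= Ylimit for all Y >= 0,
% since the argument of log never drops below exp(Ylimit)
lossSmooth = -sum(W*(T.*log(Y + exp(Ylimit))))/N;

For Y well above exp(Ylimit) the two losses agree closely; they differ only for predictions near zero, where the smooth version levels off instead of kinking.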
3 Comments
More Answers (0)