Injection noise to CNN through customized training loop

Hi there.
I am using a customized training loop to train my CNN. For my network design, I need to inject Gaussian noise into each layer. I could not find a noise layer or L2 regularization in the Deep Learning Toolbox. I need to know how I can put a Gaussian noise layer (if one exists) into my model, and where exactly its place would be in the layer ordering. Then, how can I define L2 regularization consistent with my customized training loop (with dlnetwork(lgraph))? I mean, for computing the loss function (using cross-entropy) and the gradient (using dlfeval(@gradientmodel, ...)), should I simply add 0.5*norm(dlnet.Learnables) to the loss and use dlnet.Learnables(i,:), where i refers only to the weights, or is there another approach?
Thanks for any help.

Accepted Answer

Shashank Gupta on 7 Jan 2021
Hi Mahsa,
There is no built-in layer for adding Gaussian noise in MATLAB, although you can create a custom one. Also check out this example: it shows a custom Gaussian layer that you can use as a starting point, and it will definitely help you.
All of the parameters in trainingOptions, including L2 regularization, can be implemented in a custom training loop. I suggest you follow this doc page; it gives a detailed explanation of how the different parameters can be implemented when using a custom training loop.
I hope this gives you a good head start.
Cheers.
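As a starting point, here is a minimal sketch of such a custom layer. The class name, the Sigma property, and the constructor arguments are my own choices, not part of any toolbox API; the layer adds zero-mean Gaussian noise only during training (via forward) and passes data through unchanged at inference (via predict). A common placement is right after the input layer or after a convolution, before the activation, but where it belongs depends on your design.

```matlab
classdef gaussianNoiseLayer < nnet.layer.Layer
    % gaussianNoiseLayer  Adds zero-mean Gaussian noise during training.
    % Sketch only: class name and Sigma are illustrative, not a toolbox API.

    properties
        Sigma   % Standard deviation of the injected noise
    end

    methods
        function layer = gaussianNoiseLayer(sigma, name)
            % Construct the layer with a given noise level and name.
            layer.Sigma = sigma;
            layer.Name = name;
            layer.Description = "Gaussian noise, sigma = " + sigma;
        end

        function Z = predict(layer, X)
            % Inference: pass the input through unchanged.
            Z = X;
        end

        function Z = forward(layer, X)
            % Training: add noise drawn like X (so dlarray/gpuArray types
            % are preserved).
            Z = X + layer.Sigma * randn(size(X), 'like', X);
        end
    end
end
```

With a dlnetwork custom training loop, calling forward(dlnet, ...) uses the noisy path and predict(dlnet, ...) the clean one.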
  1 Comment
MAHSA YOUSEFI on 10 Jan 2021
Edited: MAHSA YOUSEFI on 10 Jan 2021
Thank you for your help. I am following your suggestions. Just one more thing about L2 regularization. On the page you linked, there is an update only for the gradients, not for the loss. I should first update the loss (adding the regularization term to the unregularized objective: loss(w) = loss(w) + l2Regularization/(2N) * ||w||^2) and then apply the gradient update as mentioned there. Am I right?
Also, in that link, "N" (the sample size used for computing the loss and gradient) was ignored. I think the update term for the gradient should be:
gradients(idx,:) = dlupdate(@(g,w) g + (l2Regularization./N)*w, gradients(idx,:), dlnet.Learnables(idx,:));
not
gradients(idx,:) = dlupdate(@(g,w) g + l2Regularization*w, gradients(idx,:), dlnet.Learnables(idx,:));
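One point worth noting: you should do one or the other, not both. If the penalty lambda/(2N)*||w||^2 is added to the loss before calling dlgradient, automatic differentiation already produces the extra (lambda/N)*w term, so the separate dlupdate step is unnecessary (and applying both would double the regularization). A sketch of the loss-side variant, with illustrative function and variable names (gradientmodel follows the question; the assumption that the network ends in a softmax so crossentropy applies directly is mine):

```matlab
function [gradients, loss] = gradientmodel(dlnet, dlX, Y, l2Regularization, N)
    % Forward pass (training mode) and unregularized cross-entropy loss.
    % Assumes the network's final layer is a softmax.
    dlYPred = forward(dlnet, dlX);
    loss = crossentropy(dlYPred, Y);

    % Select the weight parameters only (biases are not penalized).
    idx = dlnet.Learnables.Parameter == "Weights";
    weights = dlnet.Learnables.Value(idx);

    % Add the L2 penalty: lambda/(2N) * sum of squared weights.
    for k = 1:numel(weights)
        loss = loss + l2Regularization/(2*N) * sum(weights{k}.^2, 'all');
    end

    % dlgradient now returns the regularized gradient directly,
    % so no dlupdate correction is needed afterwards.
    gradients = dlgradient(loss, dlnet.Learnables);
end
```

Adding the term to the loss also makes the reported loss value reflect the regularized objective, which is convenient when monitoring training.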


More Answers (0)
