Forward function with frozen batch normalization layers
In my application I have both batch normalization and dropout layers, and I would like to perform MC dropout using the forward function. Ideally, I would freeze the TrainedMean and TrainedVariance parameters of the batch normalization layers, but I cannot work out whether this is possible. The BN layers come after the conv layers, and the dropout layer comes after the recurrent layer in my net. Thank you in advance.
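Not an authoritative answer, but one possible workaround sketch: `forward` on a `dlnetwork` runs the layers in training mode, so dropout stays active (which is what MC dropout needs) but batch normalization switches to mini-batch statistics. There is no flag to freeze a `batchNormalizationLayer` during `forward`, so one option is to replace each BN layer with a `functionLayer` that applies the trained statistics as a fixed affine transform. The sketch below assumes a trained series (non-branching) `dlnetwork` called `net`, MATLAB R2021b or later for `functionLayer`, and illustrative variable names throughout:

```matlab
% Replace each batchNormalizationLayer with a functionLayer that applies
% the frozen normalization, so forward() still enables dropout for MC
% dropout while BN behaves as it would at inference time.
lgraph = layerGraph(net.Layers);   % assumes a series network (no branches)

for i = 1:numel(lgraph.Layers)
    L = lgraph.Layers(i);
    if isa(L, 'nnet.cnn.layer.BatchNormalizationLayer')
        mu  = L.TrainedMean;
        sd  = sqrt(L.TrainedVariance + L.Epsilon);
        g   = L.Scale;
        b   = L.Offset;
        % Fixed affine transform using the trained statistics; the
        % captured variables are baked into the anonymous function.
        frozen = functionLayer(@(X) (X - mu)./sd .* g + b, ...
            'Name', L.Name, 'Formattable', true);
        lgraph = replaceLayer(lgraph, L.Name, frozen);
    end
end
netFrozen = dlnetwork(lgraph);

% MC dropout: repeated stochastic forward passes over the same input.
% X and its 'SSCB' format are assumptions about your data layout.
dlX = dlarray(single(X), 'SSCB');
T = 50;                         % number of stochastic passes
Y = cell(T, 1);
for t = 1:T
    Y{t} = forward(netFrozen, dlX);   % dropout is resampled each pass
end
% Average (and spread of) Y{1..T} give the MC-dropout estimate.
```

If your network branches, rebuild the `layerGraph` from the original layer graph rather than from `net.Layers`, since the plain layer array only encodes sequential connections.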
1 Comment
Imola Fodor
on 28 Feb 2024
Accepted Answer
More Answers (0)
Categories
Find more on Deep Learning Toolbox in Help Center and File Exchange
