Importing Caffe network error: 'Scale layer without a preceding BatchNorm layer'
I am trying to use a pretrained Caffe model of a CNN (TrailNet_SResNet-18, from here) for comparison purposes, and there is a problem I can't solve. When I use
importCaffeNetwork(protofile,datafile)
I get this error:
'The layer 'sub_mean' specifies a Scale layer without a preceding BatchNorm layer. Scale layers are only supported when preceded by a BatchNorm layer.'
One of the authors' contributions was removing the Batch Normalization layers and substituting the ReLU layers with a shifted ReLU, and looking closely at the structure of the network shows that it contains a lot of 'Scale' layers.
I tried to find another importable version of the same net (Keras or ONNX) but couldn't find any. I also tried to convert it to Keras or ONNX myself, but that failed too, with many errors that have been reported to the authors of the converters.
I thought about reproducing the network in MATLAB; however, there is no shifted ReLU layer, on top of the main problem above, the 'without a preceding BatchNorm layer' error.
I hope someone can help me find a solution to this problem or a workaround for it.
Answers (1)
Shashank Gupta
on 29 Aug 2019
If we look at the original Batch Normalization paper, the authors mention that "we make sure that the transformation inserted in the network can represent the identity transform". In Caffe, a BatchNorm layer without a following Scale layer cannot do this, since Caffe's BatchNorm layer has no learnable parameters; the Scale layer supplies the learnable scale and offset.
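For context, MATLAB's batchNormalizationLayer already contains the learnable scale (gamma) and offset (beta) that Caffe factors out into a separate Scale layer, which is presumably why the importer only accepts a Scale layer directly after a BatchNorm layer (the pair can be folded into one MATLAB layer). A minimal sketch of the MATLAB-side equivalent (the layer names and sizes here are just illustrative):
% One batchNormalizationLayer covers what Caffe expresses as a BatchNorm
% layer (normalization only) followed by a Scale layer (learnable gamma/beta).
layers = [
    imageInputLayer([224 224 3], 'Name', 'input')
    convolution2dLayer(3, 64, 'Name', 'conv1')
    batchNormalizationLayer('Name', 'bn1') % includes learnable Scale and Offset
    reluLayer('Name', 'relu1')
];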
You can look at the Batch Normalization paper here: https://arxiv.org/abs/1502.03167
Also, there is no built-in shifted ReLU in MATLAB, but you can always define a custom activation layer.
Refer to the documentation on defining custom deep learning layers for more information.
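As a rough sketch of such a custom layer: assuming the shifted ReLU is f(x) = max(x, -1), a cheap approximation of ELU (the shift value and the class name below are assumptions, not taken from the model files), it could look like this:
classdef shiftedReluLayer < nnet.layer.Layer
    % Shifted ReLU activation: f(x) = max(x, Shift).
    % Sketch only; the shift of -1 used below is an assumption based on
    % the description of shifted ReLU as an approximation of ELU.

    properties
        Shift % lower bound the activations are clipped to
    end

    methods
        function layer = shiftedReluLayer(name, shift)
            layer.Name = name;
            layer.Shift = shift;
            layer.Description = "Shifted ReLU, max(x, " + shift + ")";
        end

        function Z = predict(layer, X)
            % Element-wise clipping from below
            Z = max(X, layer.Shift);
        end

        function dLdX = backward(layer, X, ~, dLdZ, ~)
            % Gradient passes through wherever the input is above the shift
            dLdX = dLdZ .* (X > layer.Shift);
        end
    end
end
It can then be used in a layer array in place of reluLayer, for example shiftedReluLayer('srelu_1', -1).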
I hope it helps!