MATLAB Answers


MATLAB Dropout layer during prediction

Asked by rakbar
on 7 Jan 2019
Latest activity Answered by Don Mathis on 17 Jan 2019
The Documentation for a Dropout layer states that:
"At prediction time the output of a dropout layer is equal to its input."
I assume this means that during prediction, there is no dropout.
Is there a method in MATLAB to enable Dropout during prediction time?


3 Answers

Answer by Vishal Bhutani on 10 Jan 2019

Based on my understanding, a dropout layer is used to avoid over-fitting of the neural network. The term "dropout" refers to dropping out units (both hidden and visible) in a neural network. This functionality is only needed while training the network. At test time the whole network is used, i.e. all weights are accounted for, so during testing or prediction the output of a dropout layer is equal to its input.
It would be better if you described your use case; that might help in understanding the issue in more detail.
Hope it helps.
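As a minimal sketch of the built-in behavior described above (assuming Deep Learning Toolbox; the layer sizes here are arbitrary):

```matlab
% dropoutLayer is only active during training; at prediction time it
% passes its input through unchanged.
layers = [
    featureInputLayer(10)
    fullyConnectedLayer(20)
    reluLayer
    dropoutLayer(0.5)        % 50% dropout, applied during training only
    fullyConnectedLayer(1)
    regressionLayer];

% After training, repeated calls to predict() on the same input return
% identical outputs, because the dropout layer acts as the identity.
```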

  1 Comment

rakbar
on 10 Jan 2019
Some recent studies and tests have shown that when the dropout layer is also active at prediction time, the prediction interval (or confidence interval) of the target can be estimated. See, for example:
"A Theoretically Grounded Application of Dropout in Recurrent Neural Networks" https://arxiv.org/abs/1512.05287
At prediction time, the idea is to run a few Monte-Carlo-style loops for each target to obtain a distribution of predictions, then report the mean and standard deviation of that distribution as the final target prediction and its error. This can be done in Python/Keras/TensorFlow.
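The Monte-Carlo loop above can be sketched in MATLAB as follows (a hedged sketch: it assumes a trained dlnetwork `net` containing dropout layers and a single dlarray input `X`; `forward` runs the network in training mode, so dropout stays active):

```matlab
% Monte-Carlo dropout: run the stochastic forward pass many times and
% summarize the resulting distribution of predictions.
numSamples = 100;
preds = zeros(numSamples, 1);
for k = 1:numSamples
    Y = forward(net, X);          % dropout is active on each pass
    preds(k) = extractdata(Y);
end
muHat    = mean(preds);           % point prediction
sigmaHat = std(preds);            % predictive uncertainty estimate
```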



Answer by Greg Heath
on 11 Jan 2019

help dropout
... It is important to note that when creating a network, dropout will only be used during training.
Hope this helps.
Thank you for formally accepting my answer
Greg



Answer by Don Mathis on 17 Jan 2019

You could write yourself a custom dropout layer that applies dropout in both the forward() and predict() methods. For dropout rate p, it would set each activation to 0 with probability p and then scale the remaining activations by 1/(1-p) (inverted dropout).
I'm not sure, but you might be able to give it a writable 'p' property so you could set it to 0 after training if you want.
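A hedged sketch of such a custom layer (the class name MCDropoutLayer is an illustration, not a built-in layer; it assumes the R2018a+ custom-layer API, where forward() falls back to predict() when not defined, so dropout is applied in both modes):

```matlab
classdef MCDropoutLayer < nnet.layer.Layer
    properties
        p   % dropout probability; writable, so it can be set to 0 later
    end
    methods
        function layer = MCDropoutLayer(p, name)
            layer.p = p;
            layer.Name = name;
        end
        function Z = predict(layer, X)
            % Unlike dropoutLayer, apply dropout at prediction time too:
            % zero each activation with probability p, then rescale the
            % survivors by 1/(1-p) (inverted dropout).
            mask = (rand(size(X), 'like', X) >= layer.p);
            Z = X .* mask ./ (1 - layer.p);
        end
    end
end
```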
