How to avoid getting negative values when training a neural network?
Is there any way to constrain the network outputs when training a feedforward neural network in MATLAB?
I am trying to train a supervised feedforward neural network on 100,000 observations. I have 5 continuous input variables and 3 continuous responses (labels). All my values are positive (both labels and variables). However, when I train the network, it sometimes predicts negative results no matter what architecture I use. Negative results have no physical meaning and should not appear. Is there any way to constrain the network? I also tried a ReLU activation function on the last layer, but then the network does not generalize well.
Thanks
Accepted Answer
More Answers (1)
Greg Heath
on 18 Jan 2020
0 votes
Use a sigmoid for the output layer.
Hope this helps
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
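Greg's suggestion can be sketched with the shallow-network API from Deep Learning Toolbox. This is a minimal sketch, not the poster's actual setup: the hidden-layer size of 10 is an arbitrary choice, and `x` (5-by-N inputs) and `t` (3-by-N targets) stand in for the real data. The idea is to scale the targets into [0,1], use a logistic sigmoid (`logsig`) on the output layer so predictions are bounded in (0,1), and invert the scaling after prediction, which guarantees non-negative outputs:

```matlab
% x: 5-by-N matrix of inputs, t: 3-by-N matrix of positive targets
net = feedforwardnet(10);                 % one hidden layer; size 10 is arbitrary
net.layers{end}.transferFcn = 'logsig';   % sigmoid output => predictions in (0,1)
net.outputs{end}.processFcns = {};        % disable default output mapminmax to [-1,1],
                                          % which would conflict with the logsig range
[tn, ts] = mapminmax(t, 0, 1);            % map each target row into [0,1]
net = train(net, x, tn);
yn = net(x);                              % network outputs, each in (0,1)
y  = mapminmax('reverse', yn, ts);        % back to the original positive range
```

Because the sigmoid output is strictly between 0 and 1, the reverse-mapped predictions can never fall below the minimum target value seen in training, so negative predictions are ruled out by construction.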
1 Comment
Mostafa Nakhaei
on 18 Jan 2020