Neural Network Output: Scaling the output range
Hi,
The output layer of my neural network (3 layers) uses a sigmoid activation, which outputs only in the range [0,1]. However, if I want to train it on targets that go beyond [0,1], say in the thousands, what should I do?
For example, if I want to train:
input            output
0     0   ---->  0
0     1   ---->  1000
1000  1   ---->  1
1     1   ---->  0
My program works for AND, OR, XOR, etc., since the inputs and outputs are all binary.
There were some suggestions to use:
Activation:
-----------
y = lambda*abs(x) * 1/(1+exp(-x))
Derivative of activation:
-------------------------
lambda*abs(y)*y*(1-y)
This did not converge for the training pattern above. Are there any suggestions, please?
Accepted Answer
Greg Heath
on 31 Jan 2012
Hello Greg,
Thanks again for answering the question. For my case, there is no rigid bound.
1. INCORRECT. ALL 3 VARIABLES ARE BOUNDED:
0 <= X1, Y <= 1000
0 <= X2 <= 1
2. HOWEVER, SINCE THE INPUT SCALES DIFFER BY A FACTOR OF A THOUSAND, X1 AND Y SHOULD BE TRANSFORMED VIA LOGS AND/OR POWERS. E.G.,
X1n = LOG10( 1 + X1 ) / LOG10( 1001 ) ==> 0 <= X1n <= 1
SIMILARLY FOR Y.
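For example, a minimal MATLAB sketch of this scaling (the training values are taken from the question; the inverse mapping at the end is an assumption, obtained simply by inverting the log transform):
X1 = [0 0 1000 1];                   % first input, range [0, 1000]
X2 = [0 1 1 1];                      % second input, range [0, 1]
Y  = [0 1000 1 0];                   % target, range [0, 1000]
X1n = log10(1 + X1) / log10(1001);   % maps [0, 1000] into [0, 1]
Yn  = log10(1 + Y)  / log10(1001);   % same transform for the target
% ... train the network on inputs [X1n; X2] with target Yn ...
% To recover the original scale from a network output Yn_hat (illustrative name):
% Y_hat = 10.^(Yn_hat * log10(1001)) - 1;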
HOPE THIS HELPS.
GREG
More Answers (1)
Greg Heath
on 29 Jan 2012
If the target has rigid bounds, scale the data to either [0,1] or [-1,1] and use either LOGSIG or TANSIG, respectively.
Otherwise, standardize to zero-mean/unit variance and use PURELIN.
To recover the original data scale, just apply the reverse transformations.
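A minimal sketch of both options, assuming the Neural Network Toolbox scaling functions mapminmax and mapstd are available (the target vector t and the network outputs yn, yz are illustrative placeholders):
t = [0 1000 1 0];                    % example targets
% Rigid bounds: map to [-1, 1] and use TANSIG in the output layer.
[tn, ts] = mapminmax(t);             % default range is [-1, 1]
% ... train with tn as the target ...
% t_rec = mapminmax('reverse', yn, ts);   % back to the original scale
% No rigid bounds: standardize to zero mean / unit variance and use PURELIN.
[tz, zs] = mapstd(t);
% t_rec = mapstd('reverse', yz, zs);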
Hope this helps.
Greg