Optimum MSE for neural networks
11 views (last 30 days)
Hi, I was designing a neural network using the app in MATLAB, and the MSE (mean squared error) that I got on the training set is 100-200. I don't have any more data to improve the network. So, should I go forward with the network, or improve it in some other way?
0 Comments
Accepted Answer
Greg Heath
on 8 Sep 2015
Edited: Greg Heath on 8 Sep 2015
Impossible to tell without knowing, or being able to calculate, the normalized degree-of-freedom-adjusted (DOFA) training-subset MSE (NMSEtrna) or the corresponding R-squared (coefficient of determination), Rsqtrna = 1 - NMSEtrna.
Search for "Rsquare" and "coefficient of determination" on Google and Wikipedia.
Calculations can be made as follows:
[ I N ] = size(x)                 % inputs: I variables, N examples
[ O N ] = size(t)                 % targets: O outputs; y = net output has the same size
Ntrn = round(0.7*N)               % No. of training examples (0.7*N is the default split)
Ntrneq = Ntrn*O                   % No. of training equations
MSEtrn00 = mean(var(ttrn',1))     % average training target variance (biased estimate)
SSEtrn = sse(ttrn-ytrn)           % training sum of squared errors, or sum(sum((ttrn-ytrn).^2))
MSEtrn = SSEtrn/Ntrneq            % training MSE
NMSEtrn = MSEtrn/MSEtrn00         % normalized MSE
Rsqtrn = 1 - NMSEtrn              % R-squared
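Putting the fragments above together, here is a minimal runnable sketch, assuming the Neural Network Toolbox (now Deep Learning Toolbox); simplefit_dataset and H = 10 are arbitrary stand-ins for your own data and architecture:
[x, t] = simplefit_dataset;           % toy dataset shipped with the toolbox
net = fitnet(10);                     % H = 10 hidden nodes (illustrative choice)
[net, tr] = train(net, x, t);         % tr records the train/val/test indices
y = net(x);                           % network output for all examples
ttrn = t(:, tr.trainInd);             % training-subset targets
ytrn = y(:, tr.trainInd);             % training-subset outputs
[O, ~] = size(t);
Ntrn = numel(tr.trainInd);
Ntrneq = Ntrn*O;                      % No. of training equations
MSEtrn00 = mean(var(ttrn', 1));       % biased average target variance
SSEtrn = sum(sum((ttrn - ytrn).^2));  % training sum of squared errors
MSEtrn = SSEtrn/Ntrneq;
NMSEtrn = MSEtrn/MSEtrn00;
Rsqtrn = 1 - NMSEtrn                  % coefficient of determination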
Adjustments ("a") for degrees of freedom lost when evaluating the net with the same data that was used to estimate the weights:
H = net.layers{1}.size            % No. of hidden nodes
Nw = (I+1)*H+(H+1)*O              % No. of unknown weights
Ndof = Ntrneq - Nw                % No. of degrees of freedom
MSEtrn00a = mean(var(ttrn',0))    % average target variance (unbiased estimate)
NOTE: If there are more unknown weights than training equations, Ndof < 0, and special methods such as Bayesian regularization (trainbr) and/or explicit regularization are required (see the sketch after the next code block). Otherwise, for DOFA with Ndof > 0:
MSEtrna = SSEtrn/Ndof             % DOF-adjusted training MSE
NMSEtrna = MSEtrna/MSEtrn00a      % DOF-adjusted normalized MSE
Rsqtrna = 1 - NMSEtrna            % DOF-adjusted R-squared
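As an aside, for the Ndof < 0 case flagged in the NOTE above, this is a sketch of the two usual remedies (the regularization ratio 0.25 is an arbitrary illustration, not a recommendation):
% Option 1: Bayesian regularization, which manages effective DOF internally
net = fitnet(10, 'trainbr');
% Option 2: add a weight penalty to the default mse performance function
net = fitnet(10);
net.performParam.regularization = 0.25;   % 0 = pure mse, 1 = pure weight penalty
[net, tr] = train(net, x, t);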
For many problems an appropriate training goal is
Rsqtrna >= 0.99
or, equivalently (substituting the definitions above),
MSEtrn <= MSEtrngoal = 0.01*max(Ndof,0)*MSEtrn00a/Ntrneq
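To enforce that goal during training, one possibility (a sketch, continuing from the variables computed above and assuming the default trainlm training function) is to assign MSEtrngoal to net.trainParam.goal before calling train:
MSEtrngoal = 0.01*max(Ndof, 0)*MSEtrn00a/Ntrneq;
net.trainParam.goal = MSEtrngoal;    % stop when the training MSE reaches the goal
[net, tr] = train(net, x, t);        % retrain with the goal in place
If the train/val/test split is not known before training, ttrn (and hence MSEtrn00a) can be approximated by the full target set t.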
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (0)