Neural network AIC and BIC calculation (number of parameters?)
Egis Pavardenis
on 1 Apr 2015
Answered: David Franco
on 8 Aug 2017
For now, let's assume a NARNET with one hidden layer of 10 neurons and 1:2 feedback delays.
How do I calculate the AIC and BIC values?
So far I have found one approach suggested by Warren Sarle:
AIC = n*log(SSE/n) + 2*p
BIC = n*log(SSE/n) + p*log(n)
where SSE is the sum of squared errors for the training set, n is the number of training cases, and p is the number of parameters (weights and biases).
What are the training cases and how do I count them? Is there a function that returns the number of neural network parameters (like, for example, vgxcount for VARX models)?
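Would something like this give p and n? This is just a guess, assuming the network has already been trained so that a training record tr is available:
% Guess: net is the trained network, tr is the training record returned by train
p = numel(getwb(net));   % getwb returns all weights and biases as a single vector
n = numel(tr.trainInd);  % indices of the cases assigned to the training set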
Another option could be:
Mean = mean(est_data);        % sample mean of the estimation data
Covariance = cov(est_data);   % sample covariance matrix
CholCovar = chol(Covariance); % Cholesky factor of the covariance
objective = ecmnobj(est_data,Mean,Covariance,CholCovar);
logL = -objective;            % assuming ecmnobj returns the negative log-likelihood
numParam = length(est_data);  % number of parameters? (see the concern below)
numObs = length(Data);        % number of observations
[aic,bic] = aicbic(logL,numParam,numObs);
I don't know whether this suggestion is suitable, but there is a clear problem with the assumption that numParam equals the number of outputs. Then again, how do I get the number of neural network parameters?
Is the AIC and BIC calculation the same for NARX, fitnet, and other neural network models?
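Or, if Gaussian errors can be assumed, maybe the log-likelihood could be reconstructed from the training SSE and passed to aicbic, with p taken from the network itself. This is only a sketch (SSE, net and tr as above), and I am not sure it is correct:
n = numel(tr.trainInd);                      % number of training cases
p = numel(getwb(net));                       % number of weights and biases
logL = -n/2 * (log(2*pi) + log(SSE/n) + 1);  % concentrated Gaussian log-likelihood
[aic,bic] = aicbic(logL,p,n);                % Econometrics Toolbox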
Accepted Answer
Greg Heath
on 1 Apr 2015
Search the NEWSGROUP and ANSWERS using
greg Nw
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (1)
David Franco
on 8 Aug 2017
After training the network and simulating the outputs:
[net,tr] = train(net,inputs,targets);
output = sim(net,inputs);
Get the parameters and calculate the criteria (Sarle, 1995):
% Extract the training targets (tr.trainMask is 1 for training samples and NaN otherwise)
trainTargets = gmultiply(targets,tr.trainMask);
SSE = sse(net,trainTargets,output); % Sum of Squared Errors for the training set
n = length(tr.trainInd); % Number of training cases
p = length(getwb(net)); % Number of parameters (weights and biases)
% Schwarz's Bayesian criterion (or BIC) (Schwarz, 1978)
SBC = n * log(SSE/n) + p * log(n)
% Akaike's information criterion (Akaike, 1969)
AIC = n * log(SSE/n) + 2 * p
% Corrected AIC (Hurvich and Tsai, 1989)
AICc = n * log(SSE/n) + (n + p) / (1 - (p + 2) / n)
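As an illustration, the same criteria could be computed for the NARNET from the question roughly as follows. This is a sketch only, assuming y is the time series stored as a 1-by-N cell array and the default data division is used:
% Sketch: 10 hidden neurons, feedback delays 1:2 (the setup from the question)
net = narnet(1:2,10);
[Xs,Xi,Ai,Ts] = preparets(net,{},{},y);    % arrange the series for open-loop training
[net,tr] = train(net,Xs,Ts,Xi,Ai);
output = net(Xs,Xi,Ai);
trainTargets = gmultiply(Ts,tr.trainMask); % keep only the training targets
SSE = sse(net,trainTargets,output);        % sum of squared errors on the training set
n = numel(tr.trainInd);                    % number of training cases
p = numel(getwb(net));                     % number of weights and biases
AIC = n * log(SSE/n) + 2 * p;
BIC = n * log(SSE/n) + p * log(n);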
References:
- Akaike, H. (1969), "Fitting Autoregressive Models for Prediction". Annals of the Institute of Statistical Mathematics, 21, 243-247.
- Hurvich, C.M., and Tsai, C.L. (1989), "Regression and time-series model selection in small samples". Biometrika, 76, 297-307.
- Sarle, W.S. (1995), "Stopped Training and Other Remedies for Overfitting". Proceedings of the 27th Symposium on the Interface of Computing Science and Statistics, 352-360.
- Schwarz, G. (1978), "Estimating the Dimension of a Model". Annals of Statistics, 6, 461-464.