Neural network AIC and BIC calculation (number of parameters?)

For now, let's assume a NARNET with one hidden layer of 10 neurons and 1:2 delays.
How do I calculate the AIC and BIC values?
So far I have found one approach, suggested by Warren Sarle:
AIC = n*log(SSE/n) + 2*p
BIC = n*log(SSE/n) + p*log(n)
where SSE is the sum of squared errors on the training set, n is the number of training cases, and p is the number of parameters (weights and biases).
What are training cases, and how do I count them? Is there a function that returns the number of neural network parameters (like, for example, vgxcount for VARX models)?
Another option could be:
[aic,bic] = aicbic(logL, numParam, numObs);
I don't know whether this suggestion is suitable, but there is a clear problem with assuming that numParam equals the number of outputs; then again, how do I get the number of neural network parameters?
Is the AIC and BIC calculation the same for NARX, fitnet, and other neural network models?
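As a hedged sketch of the parameter-counting part (assuming the Neural Network Toolbox; getwb and configure are real toolbox functions, but the sizes below are just this question's example): the total number of weights and biases can be read off the vector returned by getwb, so no vgxcount-style helper is needed.

```matlab
% Sketch, assuming the Neural Network Toolbox. The random x and t are
% placeholders just so configure can set the layer dimensions.
x = rand(1, 100);             % example input series
t = rand(1, 100);             % example target series
net = fitnet(10);             % one hidden layer, 10 neurons
net = configure(net, x, t);   % sets input/output sizes without training
p = numel(getwb(net));        % number of parameters (weights and biases)
% Hand count for 1 input, 10 hidden neurons, 1 output:
% 1*10 input weights + 10 hidden biases + 10*1 layer weights + 1 output bias = 31.
```

The same numel(getwb(net)) call should work for NARNET and NARX networks after they have been configured or trained, since getwb flattens all weights and biases into one vector.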

Accepted Answer

Greg Heath
Greg Heath on 1 Apr 2015
Search the NEWSGROUP and ANSWERS using
greg Nw
Hope this helps.
Thank you for formally accepting my answer

More Answers (1)

David Franco
David Franco on 8 Aug 2017
After training the network and simulating the outputs:
[net,tr] = train(net,inputs,targets);
output = sim(net,inputs);
Get the parameters and calculate the criteria (Sarle, 1995):
% Getting the training targets
trainTargets = gmultiply(targets,tr.trainMask);
SSE = sse(net,trainTargets,output); % Sum of Squared Errors for the training set
n = length(tr.trainInd); % Number of training cases
p = length(getwb(net)); % Number of parameters (weights and biases)
% Schwarz's Bayesian criterion (or BIC) (Schwarz, 1978)
SBC = n * log(SSE/n) + p * log(n)
% Akaike's information criterion (Akaike, 1969)
AIC = n * log(SSE/n) + 2 * p
% Corrected AIC (Hurvich and Tsai, 1989)
AICc = n * log(SSE/n) + (n + p) / (1 - (p + 2) / n)
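To connect this back to aicbic from the question: a sketch, assuming Gaussian errors (the log-likelihood is derived from SSE at the maximum-likelihood variance SSE/n; it is not something train returns). SSE, n, and p are the quantities computed above.

```matlab
% Sketch: cross-check against aicbic (Econometrics Toolbox), assuming
% Gaussian errors. SSE, n, p as computed in the answer above.
logL = -n/2 * (log(2*pi) + log(SSE/n) + 1);  % Gaussian log-likelihood at MLE variance
[aic, bic] = aicbic(logL, p, n);
% This aic differs from n*log(SSE/n) + 2*p only by the additive constant
% n*(log(2*pi) + 1), which is the same for every candidate model and so
% does not affect the model ranking.
```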
  • Akaike, H. (1969), "Fitting Autoregressive Models for Prediction". Annals of the Institute of Statistical Mathematics, 21, 243-247.
  • Hurvich, C.M., and Tsai, C.L. (1989), "Regression and time-series model selection in small samples". Biometrika, 76, 297-307.
  • Sarle, W.S. (1995), "Stopped Training and Other Remedies for Overfitting". Proceedings of the 27th Symposium on the Interface of Computing Science and Statistics, 352-360.
  • Schwarz, G. (1978), "Estimating the Dimension of a Model". Annals of Statistics, 6, 461-464.

