Effective number of parameters in a neural network
Hello,
I'm training a neural network using the Bayesian approach. In the documentation, I read the following: "One feature of this algorithm is that it provides a measure of how many network parameters (weights and biases) are being effectively used by the network."
But I don't quite understand something: once I know the effective number of parameters, what can I do with this information? For starters, how come some of the parameters are not used? Why are some weights inactive? Secondly, can knowing this help me prune the network and reduce the number of neurons, for example? If yes, how? If no, then what is the practical use of that piece of information?
Thanks in advance for your help!
J
Accepted Answer
Greg Heath
19 May 2013
TRAINBR automatically chooses the weighting ratio applied to the sum of squared weights, which is added to the sum of squared errors to form the objective function. The choice depends on the effective number of weights.
I don't recall the formula; however, you should be able to find it in the source code, its references, or online.
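If you just want to see the number for your own network, the training record returned by TRAIN is the place to look. Here is a minimal sketch, assuming a toolbox version in which TRAINBR records the effective number of parameters per epoch in the field gamk (check your training record for the exact field name):
% Minimal sketch: Bayesian-regularized training on a toolbox demo data set,
% then inspect the effective number of parameters.
[x, t] = simplefit_dataset;              % built-in demo data
net = fitnet(10, 'trainbr');             % 10 hidden neurons, Bayesian regularization
[net, tr] = train(net, x, t);
% tr.gamk is assumed to hold gamma, the effective number of parameters,
% recorded at each epoch by TRAINBR.
effParams   = tr.gamk(end);              % value at the final epoch
totalParams = numel(getwb(net));         % total number of weights and biases
fprintf('Effective parameters: %.1f of %d total\n', effParams, totalParams)
Comparing effParams with totalParams gives a rough idea of how oversized the network is for the data.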
The only way I can see you using this information is if you use TRAINLM with the regularization option of MSE, in which case the user chooses the ratio. However, I don't know of a good reason to do that instead of using TRAINBR.
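For reference, a sketch of that manual alternative, assuming the 'mse' performance function's regularization parameter is available in your toolbox version (the 0.2 ratio is purely illustrative):
% Manual regularization with TRAINLM: the user, not the algorithm,
% picks the weighting between squared errors and squared weights.
net = fitnet(10, 'trainlm');                 % Levenberg-Marquardt training
net.performFcn = 'mse';
net.performParam.regularization = 0.2;       % user-chosen ratio (illustrative value)
[net, tr] = train(net, x, t);                % x, t as in the sketch above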
Hope this helps.
Thank you for formally accepting my answer.
Greg