What is the meaning of the training state "Sum Squared Param (ssX)" while training a neural network with the Bayesian Regularization algorithm?

While solving an Input-Output Fitting problem with a neural network trained with the Bayesian Regularization algorithm, we can plot the neural network training state. I attached an example figure here. My question is: what is the meaning of Sum Squared Param (ssX)? I have learnt that "Num Parameters" corresponds to the effective number of parameters, but when I searched for "Sum Squared Param" I couldn't find any direct explanation. Is it the sum of squared weights (SSW)?

Accepted Answer

Harsha Vardhan on 11 Sep 2023
Edited: Harsha Vardhan on 11 Sep 2023
Hi,
I understand that you want to know the meaning of "Sum Squared Param (ssX)" in the neural network training state plot. This term means the sum of the squared parameters, where the parameters are the network's weights and biases collected into a single vector.
This can be confirmed from the source code of the training function, as follows.
Execute the following command in the MATLAB Command Window to view the source code of trainbr.m (the Bayesian Regularization training function):
open trainbr
In that source code, the following lines show how ssX is computed.
Line 195: This line stores the neural network’s weight and bias vector in a variable.
worker.WB = calcLib.getwb(calcNet);
Line 211: This line calculates ssX by multiplying the transpose of the weight and bias vector with the vector itself.
worker.ssX = worker.WB' * worker.WB;
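For intuition, here is a small standalone illustration (not taken from trainbr.m) of what that expression computes for any column vector of parameters:
wb  = [0.5; -1.2; 0.3];   % hypothetical stand-in for the network's weight/bias vector
ssX = wb' * wb;           % 0.25 + 1.44 + 0.09 = 1.78
% Equivalent forms: sum(wb.^2) and norm(wb)^2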
Next, to understand the value returned by calcLib.getwb in line 195 of trainbr.m, navigate to the getwb function defined in getwb.m. This can be done by highlighting getwb on that line and pressing Ctrl+D.
As its header comment shows, the getwb function returns all network weight and bias values as a single vector:
function wb = getwb(net,hints)
%GETWB Get all network weight and bias values as a single vector.
Using the above, we can conclude that "Sum Squared Param (ssX)" is the sum of the squared elements of the network's weight and bias vector (the squared 2-norm of that vector). This is the parameter penalty that Bayesian Regularization uses for regularization, so it is essentially the sum of squared weights (SSW) you mention, with the biases included as well.
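As a quick check, here is a minimal sketch (not part of the original workflow; it uses the built-in simplefit_dataset example data) that trains a small network with trainbr and reproduces the final "Sum Squared Param" value from the trained weights and biases:
[x, t] = simplefit_dataset;      % example fitting data shipped with the toolbox
net = fitnet(10, 'trainbr');     % fitting network trained with Bayesian Regularization
net = train(net, x, t);
wb  = getwb(net);                % all weights and biases as one column vector
ssX = wb' * wb;                  % same computation as line 211 of trainbr.m
disp(ssX)                        % should match the final "Sum Squared Param" value in the training state plot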
