
This question is closed. Reopen it to edit or answer it.

How do I determine the best model order of my Network? Also clarify some doubts regarding free parameters such as weights

1 view (last 30 days)
Hello,
I have a two-dimensional input and a one-dimensional target output. My NN has one hidden layer with 10 neurons in it:
[I N] = size(input); % I = 2, N = 100
[O N] = size (target); % O = 1
I used nftool to do the curve fitting. I trained it and looked at the plots, etc.
My questions are as follows -
1. How do I see the equation with which the NN determines the 'estimated outputs'?
2. I didn't see how I could plot my estimated output vs. input as a 3-D mesh plot, with the X and Y axes being the two input dimensions and Z the estimated output. Since I only have the target values Z* (let's say), the syntax
mesh(X,Y,Z*)
doesn't work. Can anybody kindly help me out here?
3. The point of asking for the equation of the estimated outputs in terms of the inputs is to determine the model order, which I then want to change in order to find the optimal model complexity. Even if I get the equation, I could not see anywhere how to change the model order, say from a 2nd-degree polynomial to a 4th- or 5th-degree polynomial, whereas I have seen literature on fitting curves from a straight line up to a 10th-order polynomial.
4. My task is to find the optimum level of regularization in order to find the optimal order of my network. Can anybody explain to me what this is with a simple example, or point me in the direction I should head to understand it? To me, optimization has always meant varying the number of neurons in the hidden layer and changing the training, validation, and testing samples by trial and error to improve the error.
Now to clarify some doubts
1. I saw @GregHeath mention many times that the number of unknown weights is
Nw = (I +1)*H+(H+1)*O
Can you kindly clarify this formula? I = 2, 3 neurons in the hidden layer (H = 3), and 1 output give 9 free weight parameters to be adjusted (Neural Networks Demystified [Part 3: Gradient Descent]). Why is there a +1 in your I and H terms inside the products?
2. Is the 'training function' in 'nntool' a sort of backpropagation method? What, then, is meant by the 'adaptive learning function' there?
3. Can somebody kindly explain, with an example, the 'connection' terms shown after creating the network:
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
Thank you very much.
  2 Comments
Greg Heath
Greg Heath on 30 Aug 2018
Edited: Greg Heath on 30 Aug 2018
1. The "1"s take into account the bias weight connections.
2. Adaptive learning means each weight and bias is adapted
after EACH SINGLE measurement is presented.
Hope this helps
Greg
Greg Heath
Greg Heath on 10 Nov 2018
% Asked by Harleen Singh Makhija on 29 Aug 2018
% Latest activity: Answered by Akshay Kumar on 4 Sep 2018
%
% I have a two dimensional input and one dimensional target output. My NN
% has one hidden layer with 10 neurons (hidden) in it
% [I N] = size(input); % I = 2, N = 100
% [O N] = size(target); % O = 1
% I used nftool to do the curve fitting. I trained it and saw the plots,
% etc all.
% My questions are as follows -
%
% 1. How do I see the equation with which the NN determined the 'estimated
% outputs'?
y = B2 + LW * tanh( IW * x + B1 )
% NOTE tanh(z) = ( exp(z) - exp(-z) ) / ( exp(z) + exp(-z) )
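As a hedged sketch of the closed-form expression above (NumPy rather than MATLAB; the weight values below are made up, not from a trained net, and the mapminmax input/output scaling that nftool networks also apply is omitted):

```python
import numpy as np

# Shapes follow the MATLAB convention: x is I-by-1, IW is H-by-I,
# B1 is H-by-1, LW is O-by-H, B2 is O-by-1.
I, H, O = 2, 10, 1
rng = np.random.default_rng(0)
IW = rng.standard_normal((H, I))   # input-to-hidden weight matrix
B1 = rng.standard_normal((H, 1))   # hidden-layer biases
LW = rng.standard_normal((O, H))   # hidden-to-output weight matrix
B2 = rng.standard_normal((O, 1))   # output bias

def net_output(x):
    """y = B2 + LW * tanh(IW*x + B1): the one-hidden-layer nftool form."""
    return B2 + LW @ np.tanh(IW @ x + B1)

x = np.array([[0.5], [-1.0]])      # one 2-D input sample
y = net_output(x)
print(y.shape)                      # (1, 1): one scalar output
```

MATLAB's default hidden transfer function `tansig` is mathematically identical to tanh, so either name gives the same equation.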
% 2. I didn't see how I could plot my estimated output vs input
% mesh-plot in 3D. X1 and X2 axis assuming to be two dimensions of input
% and Y as the estimated output. Since I have only target value T (lets
% say), the syntax mesh(X1,X2,T) doesn't work. Can anybody kindly help me
% out here?
It can be done. However, I'm too rusty to help. Check the 3-D plot section of your manual.
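One way to build the surface (a NumPy sketch of the idea; in MATLAB the analogous steps are meshgrid, evaluating the trained net on the grid points, reshape, and mesh — the `net` below is a stand-in function, not a trained network):

```python
import numpy as np

# In MATLAB: [X1,X2] = meshgrid(...); Y = net([X1(:)'; X2(:)']);
#            mesh(X1, X2, reshape(Y, size(X1)))
def net(x):                               # stand-in: x is 2-by-N, returns 1-by-N
    return np.sum(np.tanh(x), axis=0, keepdims=True)

x1 = np.linspace(-2, 2, 25)
x2 = np.linspace(-2, 2, 25)
X1, X2 = np.meshgrid(x1, x2)              # two 25-by-25 coordinate grids
X = np.vstack([X1.ravel(), X2.ravel()])   # 2-by-625 matrix of grid points
Y = net(X).reshape(X1.shape)              # back to 25-by-25 for the surface
print(Y.shape)                            # (25, 25)
```

The key point is that the net expects inputs as columns, so the two grids are flattened into a 2-by-N matrix, evaluated, and the output reshaped back to the grid before plotting.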
% 3. The point of asking the equation of estimated outputs in terms of
% inputs is to determine the model order. And I want to change the model
% order in order to determine the optimized model complexity. Even if I get
% the equation, there was nowhere I can see how I could change the model
% order of the polynomial from say 2nd degree polynomial to 4th or 5th
% degree polynomial. Whereas I have seen literature on fitting the curve
% from straight line to 10th order polynomial.
Neural networks are described in terms of sums of tanh or sigmoid functions, not polynomials.
% 4. My task is to find the optimum level of Regularization
Incorrect terminology
% in order to find the optimal order of my Network. Can anybody explain me
% what it is with a simple example or show me the direction I should head
% to for understanding the same. To me, optimization always meant to try
% varying no. of neurons in hidden layer and change the training,
% validation and testing samples by hit and trial to improve the error.
Correct. I have posted zillions of examples in both
COMP.SOFT-SYS.MATLAB & ANSWERS
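A minimal sketch of that trial-and-error loop: train models of increasing hidden size H and compare validation error. To keep it short, "training" here is a least-squares fit of only the output layer over fixed random tanh features (a simplifying assumption for the demo, not what train/trainlm actually does):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (2, 200))                 # 2-by-N inputs
T = np.sin(np.pi * X[0]) * X[1]                  # scalar targets
Xtr, Xva = X[:, :150], X[:, 150:]                # train / validation split
Ttr, Tva = T[:150], T[150:]

for H in (1, 2, 5, 10, 20):
    IW = rng.standard_normal((H, 2))             # fixed random hidden weights
    B1 = rng.standard_normal((H, 1))
    Atr = np.tanh(IW @ Xtr + B1)                 # hidden activations, H-by-150
    W, *_ = np.linalg.lstsq(Atr.T, Ttr, rcond=None)  # output-layer weights
    Eva = np.mean((W @ np.tanh(IW @ Xva + B1) - Tva) ** 2)
    print(H, round(Eva, 4))                      # validation MSE per H
```

The "optimal order" is then simply the smallest H whose validation error stops improving, which is exactly the hit-and-trial procedure described above.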
% Now to clarify some doubts
%
% 1. I saw @GregHeath mentioned many a times the no. of unknown weights as
% Nw = (I+1)*H + (H+1)*O
%
% Can you kindly clarify this formula? I = 2 and 3 neurons in hidden layer
% (H = 3) and 1 output gives 9 free weight parameters to be adjusted
% (Neural Networks Demystified [Part 3: Gradient Descent]). Why is there +1
% in your I and H terms above while having a product?
The 1 accounts for the constant bias input. The bias value 1 is multiplied by the bias weights B1 and fed into each of the H hidden node functions.
Similarly, the output bias 1 is multiplied by B2 and connected to the output.
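A quick check of the count for I = 2, H = 3, O = 1, with the video's bias-free count for comparison:

```python
# Greg's formula counts the bias weights along with the ordinary weights.
I, H, O = 2, 3, 1
Nw = (I + 1) * H + (H + 1) * O    # each hidden node: I weights + 1 bias;
                                  # the output node: H weights + 1 bias
print(Nw)                         # 13

# The video's count of 9 is the same network WITHOUT any biases:
no_bias = I * H + H * O
print(no_bias)                    # 9
```

So the +1 terms are not extra inputs or neurons; they are the bias connections, one per hidden node and one per output node.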
% 2. Is 'training function' in 'nntool' a sort of backpropagation method?
% What is then meant by 'adaptive learning function' in the same?
adaptation: weights are adjusted one sample at a time
training: weights are adjusted simultaneously via matrix multiplication
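A toy contrast of the two update styles on a single linear neuron y = w*x (an illustration of the idea only, not the actual adapt/train code):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 50)
t = 3.0 * x                               # true slope is 3
lr = 0.1

# Adaptation: update the weight after EACH SINGLE sample
w = 0.0
for xi, ti in zip(x, t):
    w += lr * (ti - w * xi) * xi          # per-sample gradient step
print(round(w, 3))                        # drifts toward 3

# Training: one simultaneous update computed from ALL samples
w2 = 0.0
for _ in range(50):
    w2 += lr * np.mean((t - w2 * x) * x)  # whole-batch gradient step
print(round(w2, 3))                       # also approaches 3
```

Both styles head toward the same answer; they differ only in whether the error is accumulated per measurement or over the whole batch at once.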
% 3. Can somebody kindly explain with an example the terms related to
% 'connection' having created the network
My best guess:
biasConnect: [1; 1] ==> bias node: connected to the hidden and output layers
inputConnect: [1; 0] ==> input node: connected to the hidden layer
layerConnect: [0 0; 1 0] ==> hidden layer connected to the output layer
outputConnect: [0 1] ==> output layer connected to the output node
Hope this helps
Greg

Answers (1)

Akshay Kumar
Akshay Kumar on 4 Sep 2018
3. biasConnect: Defines which layers have biases. The presence (or absence) of a bias at the ith layer is indicated by a 1 (or 0).
inputConnect: Defines which layers have weights coming from inputs. The presence (or absence) of a weight going to the ith layer from the jth input is indicated by a 1 (or 0).
layerConnect: Defines which layers have weights coming from other layers. The presence (or absence) of a weight going to the ith layer from the jth layer is indicated by a 1 (or 0).
outputConnect: Defines which layers generate network outputs. The presence (or absence) of a network output from the ith layer is indicated by a 1 (or 0).
You can go through the documentation on these network object properties to know more.
