Setting up a 3 layered back-propagation neural network

I'm trying to set up a neural network with the following requirements -
  • three-layered;
  • feed-forward;
  • classical tan-sigmoid and linear functions in the hidden and output layers, respectively;
  • 5 neurons in the hidden layer;
  • trained with the Levenberg-Marquardt back-propagation algorithm;
  • converges in 5 iterations.
Basically, the neural network is to be trained by giving it an RGB map input (3 values) and target output skin parameters (3 values). I've tried using the 'nntool' Matlab wizard, but am unsure whether 'nftool' is the one I'm looking for, because it says it's two-layered and there's no option to make it converge in 5 iterations.
I'm new to setting up neural networks as that really isn't the main focus of my project. My question is - is the 'nftool' wizard the thing I'm after and are there settings in it that meet my listed neural network requirements? If not, is there some sort of coding template I can alter to create and train my neural network?
  2 Comments
José-Luis
José-Luis on 28 Jun 2016
Edited: José-Luis on 28 Jun 2016
How could you make it converge in five iterations? Convergence is not something you can impose. On the other hand, you could try and make it quit after five iterations.
stayfrosty
stayfrosty on 28 Jun 2016
Well, I'm trying to reproduce the results of a study and it did say the network "converges in 5 iterations". All I'm attempting to do is reproduce the results according to the details given. I guess, apart from that detail, is the 'nftool' the Matlab tool I should be using?


Answers (1)

Greg Heath
Greg Heath on 2 Jul 2016
Only hidden and output nodes are considered to be in neuron layers, because they are associated with non-identity transfer functions. Input nodes are only considered to be fan-in units, not neurons. Therefore, although there are three layers of nodes, you have a two-layer network because there are only two layers of neurons.
You cannot duplicate designs without knowing the random number seed from which random initial weights and random trn/val/tst data division are obtained.
If you want to use the GUI, the fitting tool nftool is appropriate.
However, I prefer the command line approach similar to the examples in the HELP and DOC documentation and the zillions of examples I have posted in both the NEWSGROUP & ANSWERS.
help fitnet
doc fitnet
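A minimal command-line sketch along these lines might look like the following. It assumes the fitnet function from the Neural Network (now Deep Learning) Toolbox; rgb and skin are placeholder names for your own 3-by-N input and target matrices, and rng(0) is just one way of fixing the random seed mentioned above:

```matlab
rng(0);                                 % fix the RNG so init weights and data division are repeatable

net = fitnet(5, 'trainlm');             % 5 hidden neurons, Levenberg-Marquardt training
net.layers{1}.transferFcn = 'tansig';   % tan-sigmoid hidden layer (fitnet's default)
net.layers{2}.transferFcn = 'purelin';  % linear output layer (fitnet's default)
net.trainParam.epochs = 5;              % stop after 5 training iterations (epochs)

% rgb: 3-by-N matrix of RGB inputs; skin: 3-by-N matrix of target skin parameters
[net, tr] = train(net, rgb, skin);
```

Note that setting epochs = 5 only makes training quit after five iterations; as discussed in the comments, it does not force convergence in five.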
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 Comment
stayfrosty
stayfrosty on 6 Jul 2016
Edited: stayfrosty on 6 Jul 2016
Thank you for your reply. Could you point me to examples which you think are relevant to my dilemma? Here is a video link to the study whose neural network I was initially trying to replicate.
When I first undertook this as a project, I didn't expect to have to mess around with neural networks. Unfortunately, to reproduce/replicate the findings of this study, it looks like a must.

