Data classification: Learning Vector Quantization or two-layer feed-forward network?

Hello,
I'm experimenting with how to use neural networks for data classification, using the iris flowers data set.
If you type 'nprtool' at the command line you can load the iris flowers data set and classify it using a two-layer feed-forward network with sigmoid hidden and output neurons. It classifies 98% of the samples correctly.
MathWorks also says that you can use a Learning Vector Quantization (LVQ) network to classify data. There is a separate example here: <http://www.mathworks.co.uk/support/solutions/en/data/1-5RDETB/index.html?product=NN&solution=1-5RDETB>
and they also demonstrate LVQ with the iris flowers data set here: http://www.mathworks.co.uk/help/toolbox/nnet/ref/lvqnet.html
Using LVQ, only 90-92% are classified correctly.
My question is: why is there a difference in classification accuracy, and ultimately, which is the better method to use?
Thank you

Accepted Answer

Greg Heath
Greg Heath on 23 Nov 2011
LVQ is not self-organizing; it is created using supervised learning. The feed-forward MLP has a different topology and has been proven to be a universal approximator. The LVQ topology is similar to that of the RBF network, and the RBF has been proven to be a universal approximator. The shortcomings of LVQ1 are addressed by LVQ2; however, it, too, is not a universal approximator.
Hope this helps.
Greg
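To make the supervised/competitive distinction concrete, here is a minimal pure-Python sketch of the LVQ1 update rule: the winning prototype is chosen by competition (nearest codebook vector), but the direction of the update is decided by the class label, so labels are required. The toy data, function names, and learning rate here are illustrative, not taken from the thread or from the toolbox.

```python
import math

def nearest(protos, x):
    # index of the prototype closest to x (Euclidean distance)
    return min(range(len(protos)),
               key=lambda i: math.dist(protos[i][0], x))

def lvq1_step(protos, x, label, lr=0.1):
    """One LVQ1 update: the winning prototype moves toward x if its
    class matches the label, and away from x otherwise (supervised)."""
    i = nearest(protos, x)
    w, c = protos[i]
    sign = 1.0 if c == label else -1.0
    protos[i] = ([wj + sign * lr * (xj - wj) for wj, xj in zip(w, x)], c)

def classify(protos, x):
    # nearest-prototype decision (1-NN over the learned codebook)
    return protos[nearest(protos, x)][1]

# toy 2-D, two-class data with one prototype per class
data = [([0.0, 0.0], 'a'), ([0.2, 0.1], 'a'),
        ([1.0, 1.0], 'b'), ([0.9, 1.1], 'b')]
protos = [([0.1, 0.2], 'a'), ([0.8, 0.9], 'b')]

for _ in range(20):
    for x, y in data:
        lvq1_step(protos, x, y)

print(classify(protos, [0.1, 0.0]))  # 'a'
print(classify(protos, [1.0, 0.9]))  # 'b'
```

Note that because the decision is piecewise-constant (nearest prototype wins), a finite LVQ codebook cannot approximate arbitrary continuous mappings, which is the universal-approximation point above.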

More Answers (3)

Vito
Vito on 9 Nov 2011
LVQ is used for self-organizing, or unsupervised, networks. In such networks there is no error-correcting teacher. Despite their great appeal, the training algorithm is very difficult: the goal is that, in the course of interacting with its "external environment", the network trains itself on the basis of some criterion. It is clear that such a problem formulation already reduces accuracy.
Self-organizing networks are certainly very interesting, and where the task allows they can be used in practice, but they are of much more interest to science.

Vito
Vito on 23 Nov 2011
I apologize, but:
LVQ1 (Learning Vector Quantization) uses competitive learning. Creating a task-independent measure during training does not change its essence, and competitive learning defines a self-organizing net.
Or does only being a "universal approximator" define a self-organizing net?

Greg Heath
Greg Heath on 26 Nov 2011
I don't agree. Reread your sources.
This is my understanding:
Unsupervised learning creates a representation of data based purely on the similarity of the characteristics of the individual data points. Except for defining the similarity measure, no omnipotent agent is involved. Moreover, the definition or creation of data categories is not required.
Self-organization uses unsupervised learning to assign data to categories. The assignments are purely based on the similarity of each data point to members of each category.
Supervised learning uses data labeled by an omnipotent agent to create a procedure for assigning arbitrary data to labeled categories.
Self-organization implies unsupervised learning.
Universal approximation is the ability to approximate a function or mapping to within any specified error tolerance using a finite number of operations.
Hope this helps.
Greg
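Greg's definition of universal approximation can also be written out formally; the following is one standard way to state it (approximation in the uniform norm over a compact set), included here as a sketch:

```latex
% A family \mathcal{N} of networks is a universal approximator if,
% for every continuous f on a compact set K and every tolerance
% \varepsilon > 0, some finite network N \in \mathcal{N} satisfies:
\forall f \in C(K),\ K \subset \mathbb{R}^n \text{ compact},\
\forall \varepsilon > 0:\ \exists N \in \mathcal{N} \ \text{such that}\
\sup_{x \in K} \lvert f(x) - N(x) \rvert < \varepsilon .
```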
