Normalization for a neural network

10 views (last 30 days)
bodie on 7 Sep 2016
Commented: bodie on 8 Sep 2016
Hello everyone, I am currently trying to predict new settings for a machine based on already known settings. The question is basically: "Which settings do I need to use if the material put into the machine has the properties X, Y and Z?" For this task I decided to use a neural network. Now I have a problem with normalizing my data and hope somebody can help me out.
I have 14 samples for each of the 4 inputs and the 1 target. Unfortunately my prediction for new data is bad, and the MSE and regression values are not great either.
I made an observation that makes me believe the problem could be in the normalization of my data. My input data is organized like this:
Variable1 >> 500 510 490 ...
Variable2 >> 0.1 0.12 0.20 ...
I first used mapstd to normalize the data. The results were not much better than without normalization (I am using nntool, so there is normalization built in anyway). The next thing I tried was zscore. Suddenly the results of the neural network were much better. I suppose normalizing the columns "evens out" my sample data?
My problem is now: how can I normalize the new data before I use it as an input to the neural network, and how can I de-normalize the prediction of the network? As zscore normalizes the columns, the mean and std are now of size 1x14. Is there a way to normalize my new data the same way as the input, and my prediction the same way as my output?
I have tried to de-normalize my solution like this:
Y = meanOutput + stdOutput * X
The problem is that the dimensions of X, stdOutput and meanOutput do not fit.
Does anyone know how to solve this?

Accepted Answer

Greg Heath on 7 Sep 2016
MAPSTD and ZSCORE perform the same zero-mean/unit-variance transformation.
If you can figure out why you are not getting the same answer then you should have the general solution to your problem.
Hope this helps
Greg
3 Comments
Greg Heath on 8 Sep 2016
NORMALIZE ROW VARIABLES, NOT COLUMN SAMPLES.
NN data should occur with N pairs of "I"-dimensional "i"nputs and "O"-dimensional "O"utput targets.
[ I N ] = size(inputs)
[ O N ] = size(targets)
I prefer to use ZSCORE to standardize the variables to zero-mean/unit-variance. This allows using MAPMINMAX for easy detection of outliers which should be removed or modified before training. However, because I am lazy, I do not change the MATLAB [-1,1] default because MATLAB has probably scaled the initial weights to fit that normalization.
For multi-dimensional variables you have to double transpose to get zx = zscore(x',1)'; transposition is not necessary for one-dimensional variables.
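The bookkeeping Greg describes (fit the per-variable statistics on the training rows, reuse those same statistics on new inputs, and invert them on predictions) can be sketched in plain Python. Everything below, including the helper names and the sample values, is an illustration of that logic rather than code from this thread:

```python
def fit_zscore(rows):
    # rows: one list per variable, each holding the N training samples.
    # Returns per-variable (mean, population std), analogous to zscore(x',1).
    stats = []
    for r in rows:
        m = sum(r) / len(r)
        s = (sum((v - m) ** 2 for v in r) / len(r)) ** 0.5
        stats.append((m, s))
    return stats

def apply_zscore(rows, stats):
    # Normalize with the SAME statistics that were fit on the training set.
    return [[(v - m) / s for v in r] for r, (m, s) in zip(rows, stats)]

def invert_zscore(zrows, stats):
    # De-normalize a prediction: y = mean + std * z, per variable.
    return [[m + s * z for z in r] for r, (m, s) in zip(zrows, stats)]

x = [[500, 510, 490],      # Variable1, 3 samples
     [0.10, 0.12, 0.20]]   # Variable2, 3 samples
stats = fit_zscore(x)
zx = apply_zscore(x, stats)                    # feed this to the network
zxnew = apply_zscore([[505], [0.15]], stats)   # new data: reuse the stored stats
x_back = invert_zscore(zx, stats)              # recovers x up to roundoff
```

The point is that the statistics are fixed at training time; new inputs and network outputs are always pushed through the same per-variable affine map or its inverse.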
Hope this helps.
Greg
P.S. I have posted zillions of examples in both the NEWSGROUP and ANSWERS
bodie on 8 Sep 2016
I have now solved the problem; in case anyone runs into the same issue, here is how I solved it:
Because the variables had a big difference in magnitude (Variable 1 = 500 and Variable 2 = 0.02), it was important to NORMALIZE THE COLUMNS! The thought behind this was that a change in Variable 1 has more influence on the weights than a change in Variable 2. I found hints about this behaviour in Iglewicz, B. (1983), "Robust scale estimators and confidence intervals for location", in Hoaglin, D.C., Mosteller, F. and Tukey, J.W., eds., Understanding Robust and Exploratory Data Analysis, NY: Wiley.
I used zscore on my input matrix to achieve this. The output matrix was scaled to [-1,1] by multiplying with a factor X, but not normalized with zscore or mapstd. This way I can easily rescale my prediction by multiplying it with X^(-1).
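The output scaling described above can be sketched the same way. The factor name `k`, the way it is chosen, and the sample numbers are all assumptions for illustration, not details from this thread:

```python
t = [120.0, -80.0, 60.0]            # hypothetical target samples
k = 1.0 / max(abs(v) for v in t)    # one way to pick the factor: largest |target| maps to 1
t_scaled = [k * v for v in t]       # targets now lie in [-1, 1]; train on these
y_scaled = t_scaled[0]              # pretend the network predicted this value
y = y_scaled / k                    # rescale back to original units, i.e. multiply by X^(-1)
```

Because the transform is a single multiplicative constant, de-normalizing a prediction never runs into the dimension mismatch from the original question.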
Thanks for your help!
Best regards, bodie


More Answers (0)
