Not getting good results when simulating in NNTOOL
Hello sir, I am using nntool in MATLAB for prediction. My application is to forecast water demand based on previous consumption. The data look like this:

year  month  day  net supply
2003  1      1    2900
2003  1      1    2915
.
.
2010  12     31   3400

My input matrix has 3 inputs (month, day, and date) and the output matrix is simply net supply. I have trained my network using nntool and got good results in training, validation, and testing (0.93), but when I try to simulate the network using nntool to forecast the consumption for the year 2011 I am not getting good results.
For example, if the actual consumption on 1 Jan 2011 is 3400, I get 2900.
Can you give me any suggestion on how to improve the results?
Accepted Answer
More Answers (5)
Greg Heath
on 7 Dec 2011
It is difficult to give advice when you haven't fully specified your problem.
How much data do you have per year? Is it evenly spaced?
Clarify the ranges of day and date (1 to 7, 1 to 31, or 1 to 366?).
To better understand your data, I recommend you color code and overlay plots of the 8 output series vs each input. Is the 2010 data similar to that of the previous years? Is it similar to the average?
I would think that day of the year and day of the week should be sufficient inputs.
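As a rough sketch of deriving those two inputs, assuming the raw records are column vectors y, m, d holding year, month, and day (names chosen here for illustration):

```matlab
% Sketch: convert raw year/month/day columns into the two suggested inputs.
dn  = datenum(y, m, d);            % MATLAB serial date numbers
doy = dn - datenum(y, 1, 1) + 1;   % day of the year, 1..366
dow = weekday(dn);                 % day of the week, 1 (Sun) .. 7 (Sat)
P   = [doy, dow]';                 % 2-by-N input matrix for the network
```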
What net, node topology and parameters did you use?
Any other important info?
Hope this helps.
Greg
1 Comment
niranjan sane
on 7 Dec 2011
Greg Heath
on 8 Dec 2011
My name is Greg, not Gary.
To better understand your data, I recommend you color code and overlay plots of the 9 output series vs each input. Are the 2010 and 2011 data similar to that of the previous years? Are they similar to the average? If so, the day of the year and day of the week should be sufficient inputs. Do those two plots look better than the three that you already have?
You are using 8 years for design. In order to try to optimize the design, I recommend REPEATEDLY using the first 7 years for training and the 8th year for validation. I typically loop over 10 candidate designs obtained with random initial weights (automatically determined by newff) for each of 10 candidate values of the number of hidden nodes, H. The net with the minimum of the 100 validation-set errors is chosen.
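A minimal sketch of that selection loop, assuming the 7 training years are already in Ptrn/Ttrn and the 8th (validation) year in Pval/Tval, suitably normalized (the variable names are illustrative):

```matlab
% Sketch: pick the best of many candidate designs by validation-set MSE.
Hcand   = 10:10:50;    % candidate numbers of hidden nodes
Ntrials = 10;          % random weight initializations per value of H
bestMSE = Inf;
for H = Hcand
    for trial = 1:Ntrials
        net = newff(Ptrn, Ttrn, H);    % weights randomized at creation
        net = train(net, Ptrn, Ttrn);  % train on the first 7 years
        Yval = sim(net, Pval);         % simulate on the 8th-year holdout
        mseVal = mse(Tval - Yval);
        if mseVal < bestMSE            % keep the best candidate so far
            bestMSE = mseVal;
            bestNet = net;
        end
    end
end
```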
For an I-H-O = 3-H-1 topology with Ntrn = 2 + 7*365 = 2557, the number of training equations is

Neq = Ntrn*O = 2557

and the number of unknown weights is

Nw = (I+1)*H + (H+1)*O = O + (I+O+1)*H = 1 + 5*H.

For robust estimation of the weights it is desired that Neq >> Nw, i.e.

H << (Neq-O)/(I+O+1) = 2556/5 ~ 511
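The arithmetic above, spelled out for one example value of H (variable names are just for illustration):

```matlab
% Sketch: training equations vs. unknown weights for an I-H-O = 3-H-1 net.
I = 3; O = 1;                      % number of inputs and outputs
H = 10;                            % example hidden-node count
Ntrn = 2 + 7*365;                  % 2557 training days (7 years, 2 leap days)
Neq  = Ntrn*O;                     % 2557 training equations
Nw   = (I+1)*H + (H+1)*O;          % unknown weights: 1 + 5*H = 51 here
Hub  = floor((Neq - O)/(I+O+1));   % H should stay well below this, ~511
```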
I suggest two runs:
1. A coarse search of 50 designs for H = 10:10:50 to determine a smaller range Hmin:Hmax.
2. A finer search of 100 designs over the smaller range.
Hope this helps.
Greg
niranjan sane
on 9 Dec 2011
Greg Heath
on 9 Dec 2011
> thanks sir, actually i am very new to neural networks so i am not able to understand what you are trying to convey. yes the data is nearly the same for 2010 and 2011, with little change in it.

I asked about the similarity of 2010 and 2011 to the 7 earlier years.
> i have constructed a neural network having 3 layers.
No. NNs are characterized by the number of weight layers, NOT the number of node layers. You have a 2-layer net.
Since all classification and regression nets have inputs and outputs, it is better to characterize nets by the number of hidden layers. Then there is no source of confusion.
> input layer having 3 inputs and one output layer. i used 6 years of data for training and 2 years of data for testing.
WHY?? I told you previously that you need a validation set to determine the best of many candidate designs! The test set is used once and only once, on the best candidate chosen by the validation set. That is the proper way to get an UNBIASED prediction of performance on nondesign data when the single best candidate (or a best group of candidates) is chosen.
>nntool is giving me good results but poor prediction.
I'm not surprised. When training to convergence without regularization, good performance on training data is not sufficient for expecting good performance on nontraining data. That is why you need a validation set.
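One way to get that validation set with the tooling of this era is the toolbox's block data division, which holds out contiguous spans of the time series rather than random samples. A hedged sketch, assuming P and T hold all 8 design years in chronological order:

```matlab
% Sketch: contiguous train/validation/test split for a time series.
net = newff(P, T, 20);                % 20 hidden nodes, for illustration
net.divideFcn = 'divideblock';        % contiguous blocks suit time series
net.divideParam.trainRatio = 6/8;     % first 6 years for training
net.divideParam.valRatio   = 1/8;     % 7th year for early stopping
net.divideParam.testRatio  = 1/8;     % 8th year untouched until the end
[net, tr] = train(net, P, T);         % tr records the three index sets
```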
> so i developed one code but this code is giving me the worst results. can you tell me what's wrong in the code.
Basically, you did not use a validation set to choose the best of many candidate designs.
> clc; clear all; % preparing input data
> % training data
-----SNIP
> % testing data
-----SNIP