Time series prediction using LSTM
Dear All,
I am trying to build an LSTM model to predict the response of a (deterministic) time series, but the results are not good at all. I have tried changing the parameters, but I still cannot get good results. Could you help me improve them?
The code is below, and I have attached the data.
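% Fouling time series (Y is assumed to be loaded from the attached data file); plot the raw data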
data = Y;
figure(2)
plot(data)
xlabel("case")
ylabel("fouling")
title("fouling plot")
numTimeStepsTrain = floor(0.95*numel(data));
dataTrain = data(1:numTimeStepsTrain+1);
dataTest = data(numTimeStepsTrain+1:end);
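% Standardize using the training mean and standard deviation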
mu = mean(dataTrain);
sig = std(dataTrain);
dataTrainStandardized = (dataTrain - mu) / sig;
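% Predictors are the series values; responses are the values shifted by one time step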
XTrain = dataTrainStandardized(1:end-1);
YTrain = dataTrainStandardized(2:end);
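% Define the LSTM network: sequence input, LSTM, fully connected, regression output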
numFeatures = 1;
numResponses = 1;
numHiddenUnits = 100;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits)
    fullyConnectedLayer(numResponses)
    regressionLayer];
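% Training options: Adam with a piecewise learning-rate schedule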
options = trainingOptions('adam', ...
    'MaxEpochs',250, ...
    'GradientThreshold',1, ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',125, ...
    'LearnRateDropFactor',0.2, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,layers,options);
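% Standardize the test data using the training statistics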
dataTestStandardized = (dataTest - mu) / sig;
XTest = dataTestStandardized(1:end-1);
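% Initialize the network state on the training data, then forecast one step
% at a time, feeding each prediction back in as the next input (closed loop)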
net = predictAndUpdateState(net,XTrain);
[net,YPred] = predictAndUpdateState(net,YTrain(end));
numTimeStepsTest = numel(XTest);
for i = 2:numTimeStepsTest
    [net,YPred(:,i)] = predictAndUpdateState(net,YPred(:,i-1),'ExecutionEnvironment','cpu');
end
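% Undo the standardization and compute the RMSE on the test set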
YPred = sig*YPred + mu;
YTest = dataTest(2:end);
rmse = sqrt(mean((YPred-YTest).^2))
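% Plot the forecast appended to the training data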
figure
plot(dataTrain(1:end-1))
hold on
idx = numTimeStepsTrain:(numTimeStepsTrain+numTimeStepsTest);
plot(idx,[data(numTimeStepsTrain) YPred],'.-')
hold off
xlabel("Time")
ylabel("Fouling Factor")
title("Fouling Prediction")
legend(["Observed" "Forecast"])
figure
subplot(2,1,1)
plot(YTest)
hold on
plot(YPred,'.-')
hold off
legend(["Observed" "Forecast"])
ylabel("Cases")
title("Forecast")
subplot(2,1,2)
stem(YPred - YTest)
xlabel("Time")
ylabel("Error")
title("RMSE = " + rmse)
0 Comments
Answers (2)
Shashank Gupta on 11 Dec 2019
Hi,
When working with LSTMs, there is no definitive rule of thumb for how many layers, nodes, or hidden units one should choose; these are all hyperparameters, and very often a trial-and-error approach gives considerably better results. The most common framework people use is k-fold cross-validation; you may want to look into it.
Every LSTM layer should be accompanied by a dropout layer, which helps prevent overfitting. As for the optimizer, adaptive moment estimation (Adam) works well. MATLAB also provides a way to find optimal hyperparameters for training models; maybe this link gives you an idea of how to approach the problem.
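For illustration, a minimal sketch of the layer array from the question with a dropout layer added after the LSTM layer (the 0.2 dropout probability is an assumed starting value, not a recommendation from this thread):
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits)
    dropoutLayer(0.2)   % assumed rate: randomly zeroes 20% of activations during training
    fullyConnectedLayer(numResponses)
    regressionLayer];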
Hope this helps.
1 Comment
lotus whit on 23 Oct 2021
Edited: lotus whit on 23 Oct 2021
Hi,
Could you please specify the minimum amount of data (rows) needed to get a good prediction? I have 33 entries (a time series from 1988 to 2012), but the results varied when I tried duplicating the values to get a better predictor.
AMMAR ATIF on 17 Aug 2022
Hi,
Reduce LearnRateDropFactor to 0.1 and increase the number of epochs to 1000. The training time is still only about 2 minutes, and the obtained RMSE is 9.2668e-06, which is excellent!
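A minimal sketch of the training options with those two changes applied (all other settings kept exactly as in the original code):
options = trainingOptions('adam', ...
    'MaxEpochs',1000, ...              % more epochs, as suggested above
    'GradientThreshold',1, ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',125, ...
    'LearnRateDropFactor',0.1, ...     % gentler learning-rate decay
    'Verbose',false, ...
    'Plots','training-progress');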
0 Comments