
Multi-Step Ahead Prediction

fariha Shahid on 6 Jan 2021
Edited: fariha Shahid on 6 Jan 2021
Hello there,
I am trying to predict the solar irradiance for the years 2022 and 2023, and I have data for the years 2015-2017. Below is the code I am running. It runs smoothly, but I am confused about a few things.
What is the range of inputSeriesVal and targetSeriesVal? I believe they are part of the data that is already present. Is that true?
What are inputSeriesPred and targetSeriesPred? Given how they are built (inputSeriesPred = [inputSeries(end-delay+1:end), inputSeriesVal]), these also come from the given data. Correct?
What I cannot understand is which part exactly predicts the data for 2022 and 2023.
What should the values of N and N1 be?
There is a gap over the years 2018-2021. Where should this gap be incorporated?
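To make the split concrete, here is a minimal toy sketch (made-up sizes, same calls and variable names as in the code below) of how I understand inputSeries/inputSeriesVal and the seeding of the closed-loop prediction; please correct me if this reading is wrong:
%Toy sketch only - 100 fake samples, 9 inputs + 1 target
N1_toy = 100; N_toy = 10; delay_toy = 4;
data_toy = rand(N1_toy,10);
inp  = tonndata(data_toy(1:N1_toy-N_toy,1:9), false,false);        %used to train the network
tgt  = tonndata(data_toy(1:N1_toy-N_toy,10),  false,false);
inpV = tonndata(data_toy(N1_toy-N_toy+1:N1_toy,1:9), false,false); %last N_toy samples, held out
%Closed-loop prediction is seeded with the last delay_toy training samples;
%the NaN targets mark the N_toy steps the network has to predict:
inpPred = [inp(end-delay_toy+1:end), inpV];
tgtPred = [tgt(end-delay_toy+1:end), con2seq(nan(1,N_toy))];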
clear all; close all; clc;
Data=load ('NoTimeKarachi.csv');
length(Data); %64424
x= Data(:,1:11);
y= Data(:,12);
%% Normalize Data between 0 & 1 and Transform Inputs
y2= (y-min(y))/(max(y)-min(y));
for i=3:11
x2(:,i)= (x(:,i)-min(x(:,i)))/(max(x(:,i))-min(x(:,i)));
end
plot(x2(:,5),y2,'o');
%% Scales the series to [-1,1]
ymax = 1; ymin = -1;
for i=3:11
x2(:,i)= (ymax-ymin)*(x(:,i)-min(x(:,i)))/(max(x(:,i))-min(x(:,i)))+ymin;
end
y2= (ymax-ymin)*(y2-min(y2))/(max(y2)-min(y2))+ymin;
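%Note: after this block the input columns 3:11 of x2 and the target y2 are all in [-1,1];
%the [0,1] values above are only used for the scatter plot.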
%Defines every column after Normalization
Date=x(:,1); Wind_Speed=x2(:,7);
Time=x(:,2); Wind_Speed_Gust=x2(:,8);
DNI=x2(:,3); Wind_Direction_STD=x2(:,9);
DHI=x2(:,4); Wind_Direction=x2(:,10);
Temp=x2(:,5); Pressure=x2(:,11);
Humidity=x2(:,6); GHI=y2;
%% Applying the Moving Average(MA) – 5min
%Converting to 5min step time series with 5min MA
P3=mediaMovel(DNI,5); P4=mediaMovel(DHI,5);
P5=mediaMovel(Temp,5); P6=mediaMovel(Humidity,5);
P7=mediaMovel(Wind_Speed,5); P8=mediaMovel(Wind_Speed_Gust,5);
P9=mediaMovel(Wind_Direction_STD,5); P10=mediaMovel(Wind_Direction,5);
P11=mediaMovel(Pressure,5); P12=mediaMovel(GHI,5);
P3_=P3(1:5:end);   P4_=P4(1:5:end);
P5_=P5(1:5:end);   P6_=P6(1:5:end);
P7_=P7(1:5:end);   P8_=P8(1:5:end);
P9_=P9(1:5:end);   P10_=P10(1:5:end);
P11_=P11(1:5:end); P12_=P12(1:5:end); %Keep every 5th sample (downsample to 5-min steps)
p5em5=[P3_',P4_', P5_',P6_',P7_',P8_',P9_',P10_',P11_', P12_'];
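%NOTE: mediaMovel is a custom helper that is not included in this post. A minimal
%sketch of what it is assumed to do (a plain w-point moving average, kept in its
%own file mediaMovel.m) could be:
%
%   function out = mediaMovel(v,w)
%       out = movmean(v,w);   %assumption: simple w-point moving average
%   end
%
%The actual implementation may differ.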
%%
N1=12885; %length of the time series with 5 min step
N = 288; % Multi-step prediction (1 day) – 5min step
% Input and target series are divided in two groups of data:
% 1st group: used to train the network
% 2nd group: this is the new data used for simulation.
inputseries = p5em5(1:N1-N,1:9);
targetseries = p5em5(1:N1-N,10);
%inputSeriesVal will be used for predicting new targets. targetSeriesVal will be used for network validation
%after prediction
inputseriesVal = p5em5(N1-N+1:N1,1:9);
targetseriesVal = p5em5(N1-N+1:N1,10); % This is generally not available
inputSeries = tonndata(inputseries,false,false);
targetSeries = tonndata(targetseries,false,false); %to neural network data command
inputSeriesVal = tonndata(inputseriesVal,false,false);
targetSeriesVal = tonndata(targetseriesVal,false,false);
%% Network Architecture
% Create a Nonlinear Autoregressive Network with External Input
delay = 16; %number of tapped delays
jj=0;
for neuronsHiddenLayer = 10 %Number of Neurons in the Hidden Layer
jj=jj+1;
% Network Creation
Ntrial = 1; %number of training trials
for ji = 1: Ntrial %Number of tests
net = narxnet(1:delay,1:delay,neuronsHiddenLayer);
% Training the network
[Xsinputs,XiinputStates,AilayerStates,Tstargets] = preparets(net,inputSeries,{},targetSeries);
% Customize training parameters
net.trainFcn = 'trainlm'; % Levenberg-Marquardt algorithm
net.trainParam.epochs = 1000;
net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 60/100;
net.divideParam.valRatio = 20/100;
net.divideParam.testRatio = 20/100;
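% 'divideblock' splits the samples into three contiguous blocks (train/val/test),
% keeping the chronological order of the time series intact.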
% Choose a Performance Function:
net.performFcn = 'mse'; % Mean squared error
%activation functions
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'purelin';
% Train the Network
[net,tr] = train(net,Xsinputs,Tstargets,XiinputStates,AilayerStates);%training the network
Y = net(Xsinputs,XiinputStates,AilayerStates);%Simulations
% Performance for the series-parallel implementation, only one-step-ahead prediction
errors = gsubtract(Tstargets,Y);
%Results analysis - Series-parallel
MSEt(ji,jj) = mse(net,Tstargets,Y);%mean square error
RMSEt(ji,jj) = sqrt(MSEt(ji,jj));%root mean square error
MAEt(ji,jj) = mae(net,Tstargets,Y);%mean absolute error
% 5. Multi-step ahead prediction
inputSeriesPred = [inputSeries(end-delay+1:end),inputSeriesVal];
targetSeriesPred = [targetSeries(end-delay+1:end),con2seq(nan(1,N))];
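% The last 'delay' samples of the training series only provide the initial
% tapped-delay states; inputSeriesVal supplies the known exogenous inputs for
% the N new steps, and the NaN targets mark the N values that the closed-loop
% network (created below) has to predict.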
netc = closeloop(net);%starts the feedback process
view(netc)
[Xsinputs,XiinputStates,AilayerStates,Tstargets] = preparets(netc,inputSeriesPred,{},targetSeriesPred);
yPred = netc(Xsinputs,XiinputStates,AilayerStates);
% FORECASTING Results analysis
MSEf(ji,jj) = mse(netc,targetSeriesVal,yPred)
RMSEf(ji,jj) = sqrt(MSEf(ji,jj))
MAEf(ji,jj) = mae(netc,targetSeriesVal,yPred)
end
end
%% CONVERT [-1,1] back to [0,1]
%Maps ymin..ymax back to 0..1
targetSeries_rev    = ((cell2mat(targetSeries))    - ymin) / (ymax-ymin);
targetSeriesVal_rev = ((cell2mat(targetSeriesVal)) - ymin) / (ymax-ymin);
yPred_rev           = ((cell2mat(yPred))           - ymin) / (ymax-ymin);
%% Figures
figure
plot([cell2mat(targetSeries),nan(1,N);
nan(1,length(targetSeries)),cell2mat(yPred);
nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
title('Normalized Centre PV Data'),xlabel('Time (5min steps)'),ylabel('')
legend('Original Targets','Network Forecasting','Expected Outputs')
figure
plot([cell2mat(yPred);cell2mat(targetSeriesVal)]')
title('Normalized Centre PV Data'),xlabel('Time (5min steps)'),ylabel('')
legend('Network Forecasting','Expected Outputs')
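As far as I can tell, the closed-loop block above only forecasts the last N samples held out from the 2015-2017 data. To actually reach 2022-2023, the same closed-loop network would have to be run over a much longer horizon, and since this is a NARX model that would also require the exogenous inputs (columns 1:9) for that future period. A rough sketch of what I imagine that would look like, assuming a hypothetical matrix futureInputs (H x 9, scaled the same way as x2) were available for the horizon to be forecast:
H = size(futureInputs,1);                        %number of future 5-min steps (hypothetical data)
inputSeriesFut  = [inputSeries(end-delay+1:end), tonndata(futureInputs,false,false)];
targetSeriesFut = [targetSeries(end-delay+1:end), con2seq(nan(1,H))];
[Xf,Xif,Aif] = preparets(netc,inputSeriesFut,{},targetSeriesFut);
yFut = netc(Xf,Xif,Aif);                         %closed-loop forecast, H steps ahead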

Answers (0)
