
Neural Networks regression using adapt

Andre Calfa on 24 Oct 2015
Commented: Greg Heath on 25 Oct 2015
Hi,
I have been writing MATLAB scripts to perform regression with several built-in algorithms, such as boosted regression trees, bagged decision trees, and neural networks. Because my data set is very large and takes a long time to train on, I decided to try incremental learning and compare it against training on the whole data set at once. My question: the MSE and R^2 I get when I use adapt are considerably worse than when I train on the whole training set normally. Is it actually supposed to be this way, or is something wrong with my implementation? Could anyone help me? Here is the code:
% Initializations
clear all;
close all;
clear classes;
trainfraction = 1/7; % fraction of each year's data to use (1/7, roughly 14 percent)
% trainArr will be used for adapt. My training data is separated by year. I call train only
% on the 2003 data, use adapt on the data from 2004 through 2010, and then predict the 2011 data.
% loadData simply unpacks each .mat file into feature matrices and output vectors.
trainArr = {'Data2004.mat', 'Data2005.mat', 'Data2006.mat', 'Data2007.mat', 'Data2008.mat', 'Data2009.mat', 'Data2010.mat' };
[X_Train,Y_Train,T_Train] = loadData({'Data2003.mat'} , trainfraction);
[X_PredYear,Y_PredYear,T_PredYear] = loadData('Data2011.mat', 0);
%%Neural Network
input = X_Train';
output = Y_Train';
net1 = feedforwardnet([20 20]);
net1.trainFcn = 'trainscg';
net1.trainParam.max_fail = 100;
net1 = train(net1, input, output);
y = net1(input);
% Predict
input = X_PredYear';
output = Y_PredYear';
input_test = input;
target_test = output;
predict = sim(net1,input_test);
% Performance
performance = mse(net1, target_test, predict)
%%R Squared
target_test_mean = mean(target_test);
SStot = sum((target_test - target_test_mean).^2);
SSreg = sum((predict - target_test_mean).^2);
SSres = sum((target_test - predict).^2);
Rerr = 1-SSres/SStot
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Here I use adapt to incrementally extend the learning obtained from "train"
for i = 1:numel(trainArr)
    [X_Train,Y_Train,T_Train] = loadData(trainArr(i), trainfraction);
    input = X_Train';
    output = Y_Train';
    net1 = adapt(net1, input, output);
    y = net1(input);
    predict = sim(net1, input_test);
    % Performance
    performance = perform(net1, target_test, predict)
    %% R Squared
    target_test_mean = mean(target_test);
    SStot = sum((target_test - target_test_mean).^2);
    SSreg = sum((predict - target_test_mean).^2);
    SSres = sum((target_test - predict).^2);
    Rerr = 1 - SSres/SStot
end
As for the results: when I use train on all the training data, my MSE is a bit above 0.3 and R^2 is around 0.76. Using adapt, after the last call my MSE is 0.46 and R^2 is around 0.66. Any ideas?
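The batch-versus-incremental gap described above can be reproduced outside MATLAB. What follows is not the poster's code but a minimal pure-Python sketch under assumed conditions: a toy linear model trained by plain gradient descent, with arbitrary chunk sizes, learning rate, and epoch counts. It illustrates why a few light incremental passes per chunk (analogous to adapt) typically underfit compared with thorough batch training on the pooled data (analogous to train).

```python
import random

random.seed(0)

# Toy data: y = 2*x + 1 + noise, split into yearly "chunks".
def make_chunk(n):
    xs = [random.uniform(-1, 1) for _ in range(n)]
    ys = [2 * x + 1 + random.gauss(0, 0.1) for x in xs]
    return xs, ys

chunks = [make_chunk(200) for _ in range(7)]   # seven "years" of data
x_test, y_test = make_chunk(200)               # held-out "2011" data

def gd_epochs(w, b, xs, ys, lr=0.1, epochs=200):
    # Full-batch gradient descent on squared error for y = w*x + b.
    n = len(xs)
    for _ in range(epochs):
        gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def metrics(w, b, xs, ys):
    # MSE and R^2 on a test set, same formulas as in the MATLAB code.
    preds = [w * x + b for x in xs]
    mse = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(ys)
    ybar = sum(ys) / len(ys)
    ss_tot = sum((y - ybar) ** 2 for y in ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    return mse, 1 - ss_res / ss_tot

# Batch: train thoroughly on all chunks pooled together.
all_x = [x for xs, _ in chunks for x in xs]
all_y = [y for _, ys in chunks for y in ys]
w_b, b_b = gd_epochs(0.0, 0.0, all_x, all_y)

# Incremental: only a few passes per chunk, one chunk at a time.
w_i, b_i = 0.0, 0.0
for xs, ys in chunks:
    w_i, b_i = gd_epochs(w_i, b_i, xs, ys, epochs=5)

mse_b, r2_b = metrics(w_b, b_b, x_test, y_test)
mse_i, r2_i = metrics(w_i, b_i, x_test, y_test)
```

In this sketch the incremental model sees each chunk only briefly, so it never fully converges and its test MSE stays above the batch model's, mirroring the gap reported above. Whether the same explanation applies to the poster's network also depends on adapt's per-sample learning rule and learning rate, which differ from trainscg.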
  1 Comment
Greg Heath on 25 Oct 2015
How many random initial weight trials did you run for each case?
Can you demonstrate this using a MATLAB example dataset?
help nndatasets
doc nndatasets
Greg


Answers (0)
