Error with CNN and LSTM network

15 views (last 30 days)
Darrien Walters
Darrien Walters on 3 Dec 2020
Commented: Vinay Kulkarni on 13 Mar 2023
Good day,
I am attempting to build a combined CNN and LSTM network with the following layers:
tempLayers = [
sequenceInputLayer(InputSize,"Name","sequence")
sequenceFoldingLayer("Name","seqfold")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
% Layer 1: 32 filters of size [40 1], stride of 1, no padding.
convolution2dLayer([40 1],32,'Stride',1,"Name","conv_1")
batchNormalizationLayer("Name","batchnorm_1")
leakyReluLayer("Name","relu_1")
maxPooling2dLayer([4 1],'Padding',"same","Name","maxpool_1")
dropoutLayer(0.1,"Name","dropout_1")
convolution2dLayer([40 1],32,'Stride',1,"Name","conv_2")
batchNormalizationLayer("Name","batchnorm_2")
leakyReluLayer("Name","relu_2")
maxPooling2dLayer([4 1],'Padding',"same","Name","maxpool_2")
dropoutLayer(0.1,"Name","dropout_2")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
sequenceUnfoldingLayer("Name","sequnfold")
flattenLayer("Name","flatten")
lstmLayer(128,"Name","lstm_1","OutputMode","last")
lstmLayer(128,"Name","lstm_2","OutputMode","last")
fullyConnectedLayer(1,"Name","fc")
%softmaxLayer("Name","softmaxlayer")
%classificationLayer("Name","classificationoutput")
regressionLayer("Name","regressionoutput")
];
lgraph = addLayers(lgraph,tempLayers);
%% Connect Layer Branches
clear tempLayers;
lgraph = connectLayers(lgraph,"seqfold/out","conv_1");
lgraph = connectLayers(lgraph,"seqfold/miniBatchSize","sequnfold/miniBatchSize");
lgraph = connectLayers(lgraph,"dropout_2","sequnfold/in");
However, when I try to train the network using a training input that is a 4-D double array and an output that is a 200-element column vector of HR data, I receive the following error:
"Error using trainNetwork (line 183)
Invalid training data. For a recurrent layer with output mode 'last', inputs must be cell arrays.
Error in ecng_6700_cw1_hw4_test_codem (line 235)
net = trainNetwork(train_input,estimator_train_output,lgraph,opts);"
I am unsure what the issue is with my data when trying to train with it.
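The error says that a recurrent layer with output mode 'last' expects cell-array inputs, i.e. one cell per observation. A minimal sketch of reshaping a 4-D numeric array into that format (the dimension layout, the number of time steps `S`, and the variable names beyond those in the post are assumptions):

```matlab
% Assume train_input is H-by-W-by-C-by-(S*N): N observations of S frames
% each, concatenated along the 4th dimension (assumed layout).
[H, W, C, total] = size(train_input);
S = 50;                     % time steps per sequence (assumed)
N = total / S;              % number of observations
XCell = cell(N, 1);
for i = 1:N
    % Each cell holds one H-by-W-by-C-by-S image sequence
    XCell{i} = train_input(:, :, :, (i-1)*S+1 : i*S);
end
% estimator_train_output should then be an N-by-1 response vector
net = trainNetwork(XCell, estimator_train_output, lgraph, opts);
```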
  2 Comments
James Lu
James Lu on 4 Feb 2022
Have you tried changing the first LSTM layer to
lstmLayer(128,"Name","lstm_1","OutputMode","sequence")
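If the layer graph has already been assembled, the output mode can be swapped without rebuilding it, using replaceLayer (a sketch; the layer name follows the original post):

```matlab
% Replace the first LSTM so it emits the full sequence instead of
% only its last time step
newLSTM = lstmLayer(128,"Name","lstm_1","OutputMode","sequence");
lgraph = replaceLayer(lgraph,"lstm_1",newLSTM);
```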
Vinay Kulkarni
Vinay Kulkarni on 13 Mar 2023
I tried this, but I get the following error:
Error in Train_Model (line 60)
net =trainNetwork(XTrain,YTest,layers,options);
Caused by:
Layer 'LSTM1': LSTM layers must have scalar input size, but input size (32×16) was received. Try using a flatten layer before the LSTM layer.
And with addition of flatten layer:
Error using trainNetwork (line 184)
The training sequences are of feature dimension 653956 32 but the input layer expects
sequences of feature dimension 32 16.
Error in Train_Model (line 60)
net =trainNetwork(XTrain,YTest,layers,options);

Sign in to comment.

Answers (1)

yanqi liu
yanqi liu on 8 Feb 2022
Yes, as James Lu suggested, you could try:
tempLayers = [
sequenceUnfoldingLayer("Name","sequnfold")
flattenLayer("Name","flatten")
lstmLayer(128,"Name","lstm_1","OutputMode","sequence")
lstmLayer(128,"Name","lstm_2","OutputMode","sequence")
fullyConnectedLayer(1,"Name","fc")
%softmaxLayer("Name","softmaxlayer")
%classificationLayer("Name","classificationoutput")
regressionLayer("Name","regressionoutput")
];
Or convert the data to a cell array, for example:
[XTrain,YTrain] = japaneseVowelsTrainData;
XTrain
XTrain = 270×1 cell array
    {12×20 double}    {12×26 double}    {12×22 double}    {12×20 double}
    {12×21 double}    {12×23 double}    {12×22 double}    {12×18 double}
    {12×24 double}    {12×15 double}    {12×23 double}    {12×15 double}
    {12×17 double}    {12×14 double}    {12×14 double}    {12×15 double}
Now you can see the data in cell form; then you can try your original network layers.
  1 Comment
Vinay Kulkarni
Vinay Kulkarni on 13 Mar 2023
I tried your suggestion above of adding sequence-unfolding and flattening layers, but I am still getting errors, such as:
layers=[
sequenceUnfoldingLayer("Name","sequnfold")
flattenLayer("Name","flatten")
lstmLayer(32,"Name","LSTM1","OutputMode","sequence")
Error in Train_Model (line 60)
net =trainNetwork(XTrain,YTest,layers,options);
Caused by:
Network: Missing input layer. The network must have at least one input layer.
Layer 'sequnfold': Unconnected input. Each layer input must be connected to the output of another layer.
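Both messages point at the snippet being passed to trainNetwork as a plain layer array: sequenceUnfoldingLayer only works inside a layerGraph that starts with a sequenceInputLayer and has its branches wired up with connectLayers, as in the original question. A minimal sketch of the connected graph (the input size and filter counts are placeholders, not values from this thread):

```matlab
lgraph = layerGraph();
% Input branch: sequence input folded into per-frame images
lgraph = addLayers(lgraph, [
    sequenceInputLayer([32 16 1],"Name","sequence")   % placeholder size
    sequenceFoldingLayer("Name","seqfold")]);
% Convolutional branch applied to each folded frame
lgraph = addLayers(lgraph, [
    convolution2dLayer([3 3],16,"Name","conv_1")      % placeholder filters
    reluLayer("Name","relu_1")]);
% Recurrent branch: unfold back into sequences, flatten, then LSTM
lgraph = addLayers(lgraph, [
    sequenceUnfoldingLayer("Name","sequnfold")
    flattenLayer("Name","flatten")
    lstmLayer(32,"Name","LSTM1","OutputMode","sequence")
    fullyConnectedLayer(1,"Name","fc")
    regressionLayer("Name","regressionoutput")]);
% Wire the branches together, including the miniBatchSize connection
lgraph = connectLayers(lgraph,"seqfold/out","conv_1");
lgraph = connectLayers(lgraph,"relu_1","sequnfold/in");
lgraph = connectLayers(lgraph,"seqfold/miniBatchSize","sequnfold/miniBatchSize");
net = trainNetwork(XTrain,YTrain,lgraph,options);
```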

Sign in to comment.
