Error using trainnet (line 46)
Bahadir
on 15 Oct 2025 at 8:58
Commented: Bahadir
on 17 Oct 2025 at 20:46
Dear sir,
My XTrain is a 48941x1 cell array and my TTrain is a 48941x1 categorical array, as shown below.


Why do I get this error?
Error using trainnet (line 46)
Number of observations in predictors (48941) and targets (1) must match. Check that
the data and network are consistent.
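For reference, the observation counts can be checked directly, assuming the workspace variables described above:
numel(XTrain)    % number of predictor observations, expected 48941 (one cell per image)
size(XTrain{1})  % size of one observation, expected 30x30
numel(TTrain)    % number of target observations, expected 48941
summary(TTrain)  % class counts of the categorical targets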
layers = [
sequenceInputLayer([30 30 1],'Name','input') % For 2-D image sequence input, InputSize is [h w c]: image height, width, and number of channels.
convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv','DilationFactor',1)
batchNormalizationLayer('Name','bn') % Normalizes each mini-batch per channel; placed between convolution and ReLU layers to speed up training and reduce sensitivity to initialization.
reluLayer('Name','Relu') % A ReLU layer thresholds its input, setting negative values to zero.
convolution2dLayer(5,8,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv','DilationFactor',1)
batchNormalizationLayer('Name','bn')
reluLayer('Name','Relu')
convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv','DilationFactor',1)
batchNormalizationLayer('Name','bn')
reluLayer('Name','Relu')
convolution2dLayer(5,16,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv','DilationFactor',1)
batchNormalizationLayer('Name','bn')
reluLayer('Name','Relu')
convolution2dLayer(5,16,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv','DilationFactor',1)
batchNormalizationLayer('Name','bn')
reluLayer('Name','Relu')
convolution2dLayer(5,32,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv','DilationFactor',1)
batchNormalizationLayer('Name','bn')
reluLayer('Name','Relu')
globalAveragePooling2dLayer(Name="gap1")
fullyConnectedLayer(7)
softmaxLayer];
options = trainingOptions("adam", ...
MaxEpochs=4, ...
InitialLearnRate=0.002,...
MiniBatchSize=128,...
GradientThreshold=1, ...
LearnRateSchedule="piecewise", ...
LearnRateDropPeriod=20, ...
LearnRateDropFactor=0.8, ...
L2Regularization=1e-3,...
Shuffle="every-epoch", ...
Plots="training-progress", ...
ObjectiveMetricName="loss", ...
OutputNetwork="best-validation", ...
ValidationPatience=5, ... % Stop training if the validation loss has not decreased for five consecutive validations.
ValidationFrequency=50, ...
Verbose=false, ...
Metrics="accuracy", ...
ValidationData={XValidation,TValidation});
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
3 Comments
Walter Roberson
on 15 Oct 2025 at 21:28
In order to test this, we would need the corresponding XValidation and TValidation.
Accepted Answer
Matt J
on 16 Oct 2025 at 1:16
Edited: Matt J
on 16 Oct 2025 at 2:15
It appears that if your XTrain is in cell array form, you need to put your TTrain data in cell form as well:
load('attachedData.mat'); clear ans; whos %Inventory
TTrain=num2cell(TTrain);
options.Plots='none'; %Online environment doesn't support plots
options.Verbose=true;
options.ValidationData={XTrain,TTrain}; %Fake validation data
testPrediction=minibatchpredict(dlnetwork(layers), XTrain(1:3)) %test
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
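Why this appears to help: with cell-array predictors, trainnet counts one observation per cell and expects the targets in the same one-entry-per-observation form, so wrapping TTrain with num2cell makes both counts 48941. A quick check, using the same variable names:
numel(XTrain)   % 48941 cells, one 30x30 image per cell
numel(TTrain)   % 48941 cells after num2cell, one categorical label per cell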
2 Comments
Matt J
on 16 Oct 2025 at 2:15
Edited: Matt J
on 16 Oct 2025 at 13:32
You are using a sequenceInputLayer, but your training inputs appear to just be 30x30 images. An imageInputLayer might be more appropriate...
load('attachedData.mat');
XTrain=cat(4,XTrain{:});
layers(1)=imageInputLayer([30,30,1],Name="input");
options.Plots='none'; %Online environment doesn't support plots
options.Verbose=true;
options.ValidationData={XTrain,TTrain}; %Fake validation data
testPrediction=minibatchpredict(dlnetwork(layers), XTrain(:,:,:,1:3)) %test
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
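A possible follow-up sketch: once net is trained, the softmax scores returned by minibatchpredict can be converted to class labels with scores2label, for example on the training data itself:
scores = minibatchpredict(net,XTrain);             % softmax scores, one row per image
YPred = scores2label(scores,categories(TTrain));   % convert scores to categorical labels
trainAccuracy = mean(YPred == TTrain)              % rough accuracy on the training set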
More Answers (0)