trainNetwork reports too many input arguments in 2024a

Transfer learning code, based on the help example, that runs in 2023b, fails in 2024a
Error using trainNetwork (line 191)
Too many input arguments.
What has changed in the 2024a version? I see that trainnet is now recommended and I can do that going forward, but I would expect old code still to run.

3 Comments

I would also expect it to still work. Please share the full error message (all the red text).
Peter
Peter on 21 Aug 2024
Moved: Voss on 21 Aug 2024
Error using trainNetwork (line 191)
Too many input arguments.
Error in train_faces2c_resnet50 (line 97)
netTransfer = trainNetwork(augimdsTrain,lgraph,options);
Caused by:
Error using gather
Too many input arguments.
I'm not able to duplicate the error given the information you've shared. Can you provide a working example we can test with? If not, then I'd suggest contacting support: https://www.mathworks.com/support/contact_us.html

Sign in to comment.

Accepted Answer

Hitesh
Hitesh on 22 Aug 2024
Edited: Hitesh on 23 Aug 2024
Hello Peter!
I've replicated your scenario in MATLAB R2024a, and the "trainNetwork" function produces the expected results. Please refer to the following example code:
numImages = 100; % Number of images
imageSize = [28, 28, 1]; % Image size (e.g., 28x28 pixels, 1 channel for grayscale)
numClasses = 10; % Number of classes (for classification)
X = rand(imageSize(1), imageSize(2), imageSize(3), numImages); % Random images
Y = categorical(randi([1, numClasses], numImages, 1)); % Random labels
imds = arrayDatastore(X, 'IterationDimension', 4); % Create an array datastore of images
lds = arrayDatastore(Y); % Create a label datastore
combinedDS = combine(imds, lds); % Combine the image and label datastores
% Define a simple network architecture
layers = [
imageInputLayer(imageSize)
convolution2dLayer(3, 8, 'Padding', 'same')
batchNormalizationLayer
reluLayer
maxPooling2dLayer(2, 'Stride', 2)
fullyConnectedLayer(numClasses)
softmaxLayer
classificationLayer];
% Set training options
options = trainingOptions('sgdm', ...
'MaxEpochs', 5, ...
'InitialLearnRate', 0.01, ...
'Verbose', false, ...
'Plots', 'training-progress');
% Train the network using the combined datastore
net = trainNetwork(combinedDS, layers, options);
Here, the command "net = trainNetwork(combinedDS, layers, options)" trains and returns a network "net" for a classification problem. "combinedDS" is a CombinedDatastore that pairs the images with their categorical labels, "layers" is an array of network layers or a LayerGraph, and "options" is a set of training options.
So the "trainNetwork" function is working as anticipated in R2024a, even though the "trainnet" function, which adds a loss-function argument, has been introduced as its recommended replacement.
Please refer to the following documentation for the "trainNetwork" function: (Not recommended) Train neural network - MATLAB trainNetwork (mathworks.com)
I hope this addresses your issue. Please refer to the attached images, which show the results from my replication using the "trainNetwork" function.
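For anyone planning the migration the question mentions, here is a minimal sketch of the equivalent call with "trainnet", reusing the same data and options as above. This is an illustrative assumption, not tested against the original code: "trainnet" takes a loss function (here "crossentropy") in place of the output layer, so "classificationLayer" is dropped while "softmaxLayer" is kept so the network outputs probabilities.

```matlab
% Same layer stack as above, minus classificationLayer: with trainnet,
% the loss is passed as an argument rather than encoded in an output layer.
layersForTrainnet = [
    imageInputLayer(imageSize)
    convolution2dLayer(3, 8, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(numClasses)
    softmaxLayer];   % keep softmax so "crossentropy" sees probabilities

% trainnet(data, layers, lossFcn, options) returns a dlnetwork
net = trainnet(combinedDS, layersForTrainnet, "crossentropy", options);
```

Note that "trainnet" returns a dlnetwork rather than a DAGNetwork/SeriesNetwork, so downstream calls such as "classify" would need to change to "predict" plus a label lookup.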

3 Comments

So the question remains: why does code that runs under 2023b not run under 2024a? Something must have changed. I'll see if I can narrow it down.
Hi Peter!
Could you share the code or a file that reproduces the error you are facing? The "trainNetwork" function seems to be operating as intended in its normal functionality.
Hi Hitesh, thanks for your time on this. Having rebooted the whole machine, rather than just restarting MATLAB, my code now runs in 2024a too. Something pretty weird, since it stopped working in 2024a and not in 2023b; I guess we'll never know.
Thanks, Peter

Sign in to comment.

More Answers (0)


Release: R2024a
Asked: 21 Aug 2024
Commented: 26 Aug 2024
