
trainNetwork image train error

Vitaliy on 3 Mar 2024
Answered: Chandrika on 19 Jul 2024 at 20:04
I am trying to train a neural network on my training images.
Here is the complete code of my script:
% Clear workspace
clear; close all; clc;
% Specify the directory path you want to delete
directoryPath = 'training_images/resized';
% Check if the directory exists before trying to delete it
if exist(directoryPath, 'dir')
    % Remove the directory and its contents
    [status, message, messageId] = rmdir(directoryPath, 's');
    % Check if the operation was successful
    if status
        disp(['Directory "', directoryPath, '" has been deleted successfully.']);
    else
        disp(['Failed to delete "', directoryPath, '": ', message]);
    end
else
    disp(['Directory "', directoryPath, '" does not exist.']);
end
% All images
imds = imageDatastore( ...
    'training_images', ...
    'IncludeSubfolders', true ...
    );
% Define class names and their corresponding IDs
classNames = ["Lesion","Background"];
labelIDs =[255,0];
% Create a pixelLabelDatastore holding the ground truth pixel labels
pxds = pixelLabelDatastore( ...
    'training_images', ...
    classNames, ...
    labelIDs ...
    );
% Create a pixel label image datastore of all images
pximds=pixelLabelImageDatastore(imds,pxds);
% Number of Images
total_num_images=length(pximds.Images);
% Visualize random images
perm=randperm(total_num_images,4);
figure;
% Visualize the images with Mask
for idx = 1:length(perm)
    % Read the image from imds
    image = readimage(imds, perm(idx));
    % Read the corresponding mask from pxds
    mask = readimage(pxds, perm(idx));
    % Display the image
    subplot(2,2,idx);
    imshow(image);
    hold on;
    binaryMask = mask(:,:,1) == classNames(1);
    % Visualize boundaries on the binary mask
    visboundaries(binaryMask, 'Color', 'r');
    % Extract the filename for the title from imds
    [~, filename] = fileparts(imds.Files{perm(idx)});
    title(sprintf('%s', filename), 'Interpreter', "none");
end
% Desired Image Size
imageSize=[224 224 3];
% Create a pixel label image datastore of all resized images
% Specify directories for resized images and labels
resizedImagesDir = 'training_images/images-segmantation/resized/images';
resizedLabelsDir = 'training_images/images-segmantation/resized/labels';
% Ensure the directories exist
if ~exist(resizedImagesDir, 'dir'), mkdir(resizedImagesDir); end
if ~exist(resizedLabelsDir, 'dir'), mkdir(resizedLabelsDir); end
% Resize and save images
for i = 1:length(imds.Files)
    img = imread(imds.Files{i});
    resizedImg = imresize(img, 'OutputSize', imageSize(1:2));
    [~, fileName, ext] = fileparts(imds.Files{i});
    imwrite(resizedImg, fullfile(resizedImagesDir, [fileName, ext]));
end
% Resize and save labels
for i = 1:length(pxds.Files)
    label = imread(pxds.Files{i});
    resizedLabel = imresize(label, 'OutputSize', imageSize(1:2), 'Method', 'nearest');
    [~, fileName, ext] = fileparts(pxds.Files{i});
    imwrite(resizedLabel, fullfile(resizedLabelsDir, [fileName, ext]));
end
% Create new ImageDatastore and PixelLabelDatastore from the resized data
imdsResized = imageDatastore(resizedImagesDir);
pximdsResz = pixelLabelDatastore(resizedLabelsDir, classNames, labelIDs);
% Clear all variables except the necessary variables
% clearvars -except pximdsResz classNames total_num_images imageSize
% Split the dataset into training, validation, and testing sets
numImages = numel(imdsResized.Files);
% Adjust the number as needed
testIdx = randperm(numImages, 5);
trainValidIdx = setdiff(1:numImages, testIdx);
% Adjust the number as needed
validIdx = trainValidIdx(randperm(length(trainValidIdx), 10));
trainIdx = trainValidIdx;
% Create datastores for training, validation, and testing sets
imdsTrain = subset(imdsResized, trainIdx);
imdsValid = subset(imdsResized, validIdx);
imdsTest = subset(imdsResized, testIdx);
pximdsTrain = subset(pximdsResz, trainIdx);
pximdsValid = subset(pximdsResz, validIdx);
pximdsTest = subset(pximdsResz, testIdx);
% Combine the validation image datastore with the pixel label datastore
% ValidationData expects a cell array with these combined datastores
validDS = combine(imdsValid, pximdsValid);
% As of MATLAB R2021a, this might not be necessary, and you can directly use validDS
validationData = {validDS};
% Applying the corrected transformation
trainDSConverted = transform(pximdsTrain, @(c) categoricalToNumeric(c{1}, classNames));
validDSConverted = transform(pximdsValid, @(c) categoricalToNumeric(c{1}, classNames));
% trainDSConverted2 = transform(trainDS, @(data) transformData(data));
% validDSConverted2 = transform(validDS, @(data) transformData(data));
% trainDSConverted3 = transform(trainDS, @transformData2);
% validDSConverted3 = transform(validDS, @transformData2);
% For imageDatastore
imdsTrain = subset(imdsResized, trainIdx);
imdsValid = subset(imdsResized, validIdx);
imdsTest = subset(imdsResized, testIdx);
% For pixelLabelDatastore
pxdsTrain = subset(pximdsResz, trainIdx);
pxdsValid = subset(pximdsResz, validIdx);
pxdsTest = subset(pximdsResz, testIdx);
pximdsTrain = pixelLabelImageDatastore(imdsTrain, pxdsTrain);
pximdsValid = pixelLabelImageDatastore(imdsValid, pxdsValid);
% Specify network architecture
numClasses = numel(classNames);
% Adjust based on your actual image size
imageSize = [224 224 3];
% Example using a ResNet-50 backbone
lgraph = deeplabv3plusLayers(imageSize, numClasses, 'resnet50');
% Define the parameters for the network
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.03, ...
    'Momentum', 0.9, ...
    'L2Regularization', 0.0005, ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 32, ...
    'VerboseFrequency', 20, ...
    'LearnRateSchedule', 'piecewise', ...
    'ExecutionEnvironment', 'cpu', ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', validDSConverted, ... % Correct specification of ValidationData
    'ValidationFrequency', 50, ...
    'ValidationPatience', 4, ...
    'Plots', 'training-progress', ...
    'GradientThresholdMethod', 'l2norm', ...
    'GradientThreshold', 0.05);
% Train the network
net = trainNetwork(trainDSConverted, lgraph, options);
% Semantic segmentation of test dataset based on the trained network
[pxdspredicted]=semanticseg(pximdsTest,net,'WriteLocation',tempdir);
% Evaluation
metrics=evaluateSemanticSegmentation(pxdspredicted,pximdsTest);
% Normalized Confusion Matrix
normConfMatData = metrics.NormalizedConfusionMatrix.Variables;
figure
h=heatmap(classNames,classNames,100*normConfMatData);
h.XLabel='Predicted Class';
h.YLabel='True Class';
h.Title='Normalized Confusion Matrix (%)';
% Number of Images
num_test_images=length(pximdsTest.Images);
% Pick any random 2 images
perm=randperm(num_test_images,2);
% Visualize the images with Mask
for idx = 1:length(perm)
    % Extract filename for the title
    [~, filename] = fileparts(pximdsTest.Images{perm(idx)});
    % Read the original file and resize it for network purposes
    I = imread(pximdsTest.Images{perm(idx)});
    I = imresize(I, [imageSize(1) imageSize(2)], 'bilinear');
    figure;
    image(I);
    hold on;
    % Read the actual mask and resize it for visualization
    actual_mask = imread(pximdsTest.PixelLabelData{perm(idx)});
    actual_mask = imresize(actual_mask, [imageSize(1) imageSize(2)], 'bilinear');
    % Ground truth boundary
    visboundaries(actual_mask, 'Color', 'r');
    % Prediction from the trained network (values are 1 and 2)
    predicted_image = uint8(readimage(pxdspredicted, perm(idx)));
    % Convert to binary and invert the polarity to match the labelIDs
    predicted_results = uint8(~(predicted_image - 1));
    visboundaries(predicted_results, 'Color', 'g');
    title(sprintf('%s Red - Actual, Green - Predicted', filename), 'Interpreter', "none");
    imwrite(mat2gray(predicted_results), sprintf('%s.png', filename));
end
% Corrected transformation function
function outImg = categoricalToNumeric(inCategorical, classNames)
    outImg = zeros(size(inCategorical,1), size(inCategorical,2), 'uint8');
    for k = 1:length(classNames)
        outImg(inCategorical == classNames{k}) = k-1;
    end
end
function [img, label] = transformData(data)
    % Assumes data is a cell with {image, label} format
    img = data{1};            % Image data
    inCategorical = data{2};  % Categorical label data
    classNames = ["Lesion","Background"]; % Class names as used previously
    outImg = zeros(size(inCategorical,1), size(inCategorical,2), 'uint8');
    for k = 1:length(classNames)
        outImg(inCategorical == classNames(k)) = k-1;
    end
    label = outImg; % Numeric label data
end
function dataOut = transformData2(dataIn)
    img = dataIn{1};   % Assuming dataIn{1} is the image
    label = dataIn{2}; % Assuming dataIn{2} is the label in numeric format
    % Your conversion logic here, ending with...
    labelCategorical = categorical(label, 0:max(label(:)), {'Background', 'Lesion'});
    dataOut = {img, labelCategorical};
end
When I run it, I get an error at this part:
% Define the parameters for the network
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.03, ...
    'Momentum', 0.9, ...
    'L2Regularization', 0.0005, ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 32, ...
    'VerboseFrequency', 20, ...
    'LearnRateSchedule', 'piecewise', ...
    'ExecutionEnvironment', 'cpu', ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', validDSConverted, ... % Correct specification of ValidationData
    'ValidationFrequency', 50, ...
    'ValidationPatience', 4, ...
    'Plots', 'training-progress', ...
    'GradientThresholdMethod', 'l2norm', ...
    'GradientThreshold', 0.05);
The value of 'ValidationData' is invalid. Invalid transform function defined on datastore.
this = this@nnet.cnn.TrainingOptionsMiniBatch(args.Definition,varargin{:});
opts = nnet.cnn.TrainingOptionsSGDM(varargin{:});
Caused by:
Attempt to grow array along ambiguous dimension.

Answers (1)

Chandrika on 19 Jul 2024 at 20:04
Hello,
Upon going through your code, I understood that the error "Attempt to grow array along ambiguous dimension" is likely caused by the "transform" function not producing the expected output dimensions for the 'validDSConverted' datastore.
Please note that this issue is occurring because the "transform" function applied to the 'pximdsValid' to obtain 'validDSConverted' should have been applied to a combined datastore. However, 'pximdsValid' is a pixel label datastore.
To fix this issue, you should use the combined datastore 'validDS' that includes both the image and pixel label datastores for the validation data. Apply the "transform" function to this combined datastore to compute the 'validDSConverted' datastore.
Additionally, ensure that your training data is also passed as a combined datastore so that there is no dimension mismatch issue encountered during the training.
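For illustration, here is a minimal sketch of that change. It reuses the variable names and the categoricalToNumeric helper already defined in your script; the exact label conversion you need may differ:
% Combine each image datastore with its pixel label datastore first
trainDS = combine(imdsTrain, pxdsTrain);
validDS = combine(imdsValid, pxdsValid);
% A read of a combined datastore returns a 1x2 cell {image, label}, so the
% transform must convert the label and return both elements together
trainDSConverted = transform(trainDS, @(data) {data{1}, categoricalToNumeric(data{2}, classNames)});
validDSConverted = transform(validDS, @(data) {data{1}, categoricalToNumeric(data{2}, classNames)});
% Pass the combined, transformed datastores to training, e.g.
% options = trainingOptions('sgdm', ..., 'ValidationData', validDSConverted, ...);
% net = trainNetwork(trainDSConverted, lgraph, options);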
For more information on preparing the training and validation datastores for training a deep learning network, I would further recommend going through the 'Prepare Training, Validation, and Test Sets' and 'Data Augmentation' sections of the corresponding MathWorks documentation.
Hope you find the above explanation useful!
Regards,
Chandrika

Release: R2023b
