Using Contrast Limited Adaptive Histogram Equalization (CLAHE) for data augmentation

Dear community,
I'm trying to apply transfer learning with a pretrained YOLOv3 model on my custom data. I want to add CLAHE alongside the traditional YOLOv3 data augmentation. Here is the code:
rng(0);
shuffledIndices = randperm(height(vehicleDataset));
idx = floor(0.6 * length(shuffledIndices));
trainingDataTbl = vehicleDataset(shuffledIndices(1:idx), :);
testDataTbl = vehicleDataset(shuffledIndices(idx+1:end), :);
imdsTrain = imageDatastore(trainingDataTbl.imageFilename);
imdsTest = imageDatastore(testDataTbl.imageFilename);
bldsTrain = boxLabelDatastore(trainingDataTbl(:, 2:end));
bldsTest = boxLabelDatastore(testDataTbl(:, 2:end));
trainingData = combine(imdsTrain, bldsTrain);
testData = combine(imdsTest, bldsTest);
augmentedTrainingData = transform(trainingData, @augmentData);
function data = augmentData(A)
data = cell(size(A));
for ii = 1:size(A,1)
    I = A{ii,1};
    bboxes = A{ii,2};
    labels = A{ii,3};
    sz = size(I);
    if numel(sz) == 3 && sz(3) == 3
        I = jitterColorHSV(I,...
            'Contrast',0.0,...
            'Hue',0.1,...
            'Saturation',0.2,...
            'Brightness',0.2);
    end
    % Randomly flip image.
    tform = randomAffine2d('XReflection',true,'Scale',[1 1.1]);
    rout = affineOutputView(sz,tform,'BoundsStyle','centerOutput');
    I = imwarp(I,tform,'OutputView',rout);
    % Apply same transform to boxes.
    [bboxes,indices] = bboxwarp(bboxes,tform,rout,'OverlapThreshold',0.25);
    labels = labels(indices);
    % Return original data only when all boxes are removed by warping.
    if isempty(indices)
        data(ii,:) = A(ii,:);
    else
        data(ii,:) = {I, bboxes, labels};
    end
end
end
function data = preprocessData(data, targetSize)
% Resize the images and scale the pixels to between 0 and 1. Also scale the
% corresponding bounding boxes.
for ii = 1:size(data,1)
    I = data{ii,1};
    imgSize = size(I);
    % Convert an input image with single channel to 3 channels.
    if numel(imgSize) < 3
        I = repmat(I,1,1,3);
    end
    bboxes = data{ii,2};
    I = im2single(imresize(I,targetSize(1:2)));
    scale = targetSize(1:2)./imgSize(1:2);
    bboxes = bboxresize(bboxes,scale);
    data(ii, 1:2) = {I, bboxes};
end
end
I want to add CLAHE (https://www.mathworks.com/help/images/ref/adapthisteq.html) to this particular section, but I keep getting an error. Can anyone help me?
Best regards.
3 Comments
MirPooya Salehi Moharer on 25 May 2021
I'm trying to adapt the code for the data augmentation process; I'm currently working on medical data processing. I added the syntax to the function section but cannot implement it. That's why I asked for help.
DGM on 26 May 2021
You didn't answer either question. If you tried to use it somewhere, I can't guess how you did it, and I can't guess what error you got. All I know is that you did something and it didn't work.
If for some reason you're trying to use it on RGB or multipage/multiframe images, then you'll need to do it one channel/page/frame at a time.
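For example, a minimal sketch of that channel-wise approach (the helper name claheRGB and its use of adapthisteq defaults are just for illustration, not from this thread):
function J = claheRGB(I)
% Apply CLAHE independently to each channel/page of a multi-channel image.
J = I;                          % preallocate with the same class and size
for c = 1:size(I,3)
    J(:,:,c) = adapthisteq(I(:,:,c));
end
end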


Accepted Answer

Shashank Gupta on 26 May 2021 (edited 26 May 2021)
Hi,
I see you want to use CLAHE as one of the data augmentation techniques. You can simply add the adapthisteq function inside the augmentData function, as described in one of the comments by @DGM. Below is a reference code for you.
% Everything inside augmentData remains the same; just add CLAHE.
function data = augmentData(A)
data = cell(size(A));
for ii = 1:size(A,1)
    I = A{ii,1};
    bboxes = A{ii,2};
    labels = A{ii,3};
    I = adapthisteq(I);   % CLAHE, applied before the geometric augmentation
    % ... keep the existing colour jitter, random flip and bboxwarp code
    % here unchanged; bboxwarp returns the surviving box indices ...
    if isempty(indices)
        data(ii,:) = A(ii,:);
    else
        data(ii,:) = {I, bboxes, labels};
    end
end
end
I hope this helps.
Cheers.
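One caveat: adapthisteq expects a 2-D grayscale image, so for RGB training images another option is to equalize only the lightness channel in L*a*b* and leave the colour channels untouched. A minimal sketch, assuming the Image Processing Toolbox (the helper name claheLab is hypothetical):
function J = claheLab(I)
% CLAHE on the L channel of an RGB image, preserving the colours.
lab = rgb2lab(I);                          % L is in the range [0,100]
L = lab(:,:,1) / 100;                      % rescale to [0,1] for adapthisteq
lab(:,:,1) = adapthisteq(L) * 100;         % equalize and scale back
J = lab2rgb(lab,'OutputType',class(I));    % return the original image class
end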
1 Comment
MirPooya Salehi Moharer on 26 May 2021
Thank you so much for your feedback. I understand adapthisteq works on grayscale images, so I converted imdsTrain to grayscale using augmentedImageDatastore. But when I try to view the augmented images I get the error "All tables horizontally concatenated must have the same number of rows".
I post the code below. I'd be enormously happy if you could help me.
rng(0);
shuffledIndices = randperm(height(vehicleDataset));
idx = floor(0.69 * length(shuffledIndices));
trainingDataTbl = vehicleDataset(shuffledIndices(1:idx), :);
testDataTbl = vehicleDataset(shuffledIndices(idx+1:end), :);
imdsTrain = imageDatastore(trainingDataTbl.imageFilename);
imdsTest = imageDatastore(testDataTbl.imageFilename);
audsTrain = augmentedImageDatastore([227 227], imdsTrain,"ColorPreprocessing","rgb2gray");
audsTest = augmentedImageDatastore([227 227], imdsTest,"ColorPreprocessing","rgb2gray");
bldsTrain = boxLabelDatastore(trainingDataTbl(:, 2:end));
bldsTest = boxLabelDatastore(testDataTbl(:, 2:end));
trainingData = combine(audsTrain, bldsTrain);
testData = combine(audsTest, bldsTest);
augmentedTrainingData = transform(trainingData, @augmentData);
% Visualize the augmented images.
augmentedData = cell(4,1);
for k = 1:4
    data = read(augmentedTrainingData);
    augmentedData{k} = insertShape(data{1,1}, 'Rectangle', data{1,2});
    reset(augmentedTrainingData);
end
figure
montage(augmentedData, 'BorderSize', 10)
function data = augmentData(A)
% Apply CLAHE, random horizontal flipping, and random X/Y scaling. Boxes that
% get scaled outside the bounds are clipped if the overlap is above 0.25.
data = cell(size(A));
for ii = 1:size(A,1)
    I = A{ii,1};
    bboxes = A{ii,2};
    labels = A{ii,3};
    sz = size(I);
    % CLAHE on the (grayscale) image.
    I = adapthisteq(I,'clipLimit',0.02,'Distribution','rayleigh');
    % Randomly flip image.
    tform = randomAffine2d('XReflection',true,'Scale',[1 1.1]);
    rout = affineOutputView(sz,tform,'BoundsStyle','centerOutput');
    I = imwarp(I,tform,'OutputView',rout);
    % Apply same transform to boxes.
    [bboxes,indices] = bboxwarp(bboxes,tform,rout,'OverlapThreshold',0.25);
    labels = labels(indices);
    % Return original data only when all boxes are removed by warping.
    if isempty(indices)
        data(ii,:) = A(ii,:);
    else
        data(ii,:) = {I, bboxes, labels};
    end
end
end
function data = preprocessData(data, targetSize)
% Resize the images and scale the pixels to between 0 and 1. Also scale the
% corresponding bounding boxes.
for ii = 1:size(data,1)
    I = data{ii,1};
    imgSize = size(I);
    % Convert an input image with single channel to 3 channels.
    if numel(imgSize) < 3
        I = repmat(I,1,1,3);
    end
    bboxes = data{ii,2};
    I = im2single(imresize(I,targetSize(1:2)));
    scale = targetSize(1:2)./imgSize(1:2);
    bboxes = bboxresize(bboxes,scale);
    data(ii, 1:2) = {I, bboxes};
end
end
function [XTrain, YTrain] = createBatchData(data, groundTruthBoxes, groundTruthClasses, classNames)
% Returns images combined along the batch dimension in XTrain and
% normalized bounding boxes concatenated with classIDs in YTrain
% Concatenate images along the batch dimension.
XTrain = cat(4, data{:,1});
% Get class IDs from the class names.
classNames = repmat({categorical(classNames')}, size(groundTruthClasses));
[~, classIndices] = cellfun(@(a,b)ismember(a,b), groundTruthClasses, classNames, 'UniformOutput', false);
% Append the label indexes and training image size to scaled bounding boxes
% and create a single cell array of responses.
combinedResponses = cellfun(@(bbox, classid)[bbox, classid], groundTruthBoxes, classIndices, 'UniformOutput', false);
len = max( cellfun(@(x)size(x,1), combinedResponses ) );
paddedBBoxes = cellfun( @(v) padarray(v,[len-size(v,1),0],0,'post'), combinedResponses, 'UniformOutput',false);
YTrain = cat(4, paddedBBoxes{:,1});
end
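The "All tables horizontally concatenated must have the same number of rows" error most likely comes from combining an augmentedImageDatastore, whose read returns a mini-batch table, with a boxLabelDatastore, which returns one row per read; combine needs the two reads to match. A minimal sketch of one workaround, assuming the original imageDatastore/boxLabelDatastore pairing from the question is kept and the grayscale conversion is moved into the transform function:
imdsTrain = imageDatastore(trainingDataTbl.imageFilename);
bldsTrain = boxLabelDatastore(trainingDataTbl(:, 2:end));
trainingData = combine(imdsTrain, bldsTrain);
augmentedTrainingData = transform(trainingData, @augmentData);

% Then, at the top of augmentData, before the geometric augmentation:
%   if size(I,3) == 3
%       I = rgb2gray(I);        % adapthisteq needs a 2-D image
%   end
%   I = adapthisteq(I,'clipLimit',0.02,'Distribution','rayleigh');
%   I = repmat(I,1,1,3);        % back to 3 channels for the network input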


More Answers (0)
