MATLAB Demo MerchData Reproducibility Problem
Meshooo
on 13 Mar 2019
Edited: Walter Roberson
on 10 Feb 2025 at 0:47
Dear all,
I am trying the MATLAB demo for Transfer Learning using AlexNet found here:
However, I couldn't reproduce the same result when using the same dataset. My code is as follows:
% Inputs
Number_of_Images_Each_Folder = 14;
Number_of_Classes = 5;
Starting_T = 1;
Experiment_Address = 'I:\DeepLearningDemos_1'; % parent directory of MerchData folder
%***********************************
Starting_T = Starting_T + 2; % because "files" starts from 3 ('.' and '..' come first)
Number_of_T = Number_of_Classes + 2; % because "files" starts from 3
%*************************************
%
Root = pwd;
%
files = dir(Root);
net = alexnet;
layers = net.Layers;
Image_Data = '\MerchData';
cd(Experiment_Address) % go to the images address
layers(23) = fullyConnectedLayer(Number_of_Classes); % note: superseded by the layersTransfer construction below
layers(25) = classificationLayer;
%%
allImages = imageDatastore('MerchData', 'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[imdsTrain, imdsValidation] = splitEachLabel(allImages, 0.8, 'randomize');
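% Note (added): 'randomize' draws a new random split on every run, so for a
% fully repeatable experiment the generator should be seeded (e.g. rng(0))
% before this call as well, not only before trainNetwork.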
%************************************************************************************************
%%
layersTransfer = net.Layers(1:end-3);
%%
layers = [
layersTransfer
fullyConnectedLayer(Number_of_Classes,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
softmaxLayer
classificationLayer];
%%
inputSize = net.Layers(1).InputSize
%%
pixelRange = [-30 30];
imageAugmenter = imageDataAugmenter( ...
'RandXReflection',true, ...
'RandXTranslation',pixelRange, ...
'RandYTranslation',pixelRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
'DataAugmentation',imageAugmenter);
%%
augimdsValidation = augmentedImageDatastore(inputSize(1:2),imdsValidation);
%%
options = trainingOptions('sgdm', ...
'MiniBatchSize',10, ...
'MaxEpochs',6, ...
'InitialLearnRate',1e-4, ...
'Shuffle','every-epoch', ...
'ValidationData',augimdsValidation, ...
'ValidationFrequency',3, ...
'Verbose',false, ...
'Plots','training-progress');
%%
rng(0) % seed the CPU random number generator before training
netTransfer = trainNetwork(augimdsTrain,layers,options);
%%
save net % saves the workspace to net.mat
% Measure network accuracy on the resized validation images (the raw
% imdsValidation images do not match the network input size)
predictedLabels = classify(netTransfer, augimdsValidation);
accuracy = mean(predictedLabels == imdsValidation.Labels)
% End of training
%**************************************************************************
%**************************************************************************
%%
Result_Table = {};
for T = Starting_T:Number_of_T
    allImages_Group_B = imageDatastore('MerchData', 'IncludeSubfolders', true, 'LabelSource', 'foldernames');
    dir_Group_B = allImages_Group_B.Files{Number_of_Images_Each_Folder*(T-2), 1};
    [Testing_Images_Path, name, ext] = fileparts(dir_Group_B);
    cd(Testing_Images_Path)
    fileList = getAllFiles(Testing_Images_Path); % list the folder once instead of on every iteration
    for Tr = 1:Number_of_Images_Each_Folder
        Testing_Image = imread(fileList{Tr,1});
        Testing_Image = imresize(Testing_Image, inputSize(1:2)); % resize to the network input size before predict
        First_Col_Table(Tr) = Tr;
        p = predict(netTransfer, Testing_Image);
        [p3,i3] = maxk(p,5); % top 5 predictions
        All_posibilities = netTransfer.Layers(end).ClassNames(i3);
        %***********************************************************
        First_high_Posibility_1 = All_posibilities{1,1};
        First_high_Posibility(Tr) = {First_high_Posibility_1};
        Second_high_Posibility_2 = All_posibilities{2,1};
        Second_high_Posibility(Tr) = {Second_high_Posibility_2};
        Third_high_Posibility_3 = All_posibilities{3,1};
        Third_high_Posibility(Tr) = {Third_high_Posibility_3};
        Fourth_high_Posibility_4 = All_posibilities{4,1};
        Fourth_high_Posibility(Tr) = {Fourth_high_Posibility_4};
        Fifth_high_Posibility_5 = All_posibilities{5,1};
        Fifth_high_Posibility(Tr) = {Fifth_high_Posibility_5};
    end
    First_high_Pos = First_high_Posibility';
    Second_high_Pos = Second_high_Posibility';
    Third_high_Pos = Third_high_Posibility';
    Fourth_high_Pos = Fourth_high_Posibility';
    Fifth_high_Pos = Fifth_high_Posibility';
    New_ResultsX = [First_high_Pos, Second_high_Pos, Third_high_Pos, Fourth_high_Pos, Fifth_high_Pos];
    cd(Root) % go back to the parent folder
    xlswrite('All_possibilities_MerchData', New_ResultsX, T-2) % one sheet per class
end
You will need the getAllFiles function below to list all the images in the current folder:
function fileList = getAllFiles(dirName)
dirData = dir(dirName);                      % get the data for the current directory
dirIndex = [dirData.isdir];                  % find the index for directories
fileList = {dirData(~dirIndex).name}';       % get a list of the files
if ~isempty(fileList)
    fileList = cellfun(@(x) fullfile(dirName,x), ... % prepend path to files
        fileList, 'UniformOutput', false);
end
subDirs = {dirData(dirIndex).name};          % get a list of the subdirectories
validIndex = ~ismember(subDirs, {'.','..'}); % find indices of subdirectories
                                             % that are not '.' or '..'
for iDir = find(validIndex)                  % loop over valid subdirectories
    nextDir = fullfile(dirName, subDirs{iDir});  % get the subdirectory path
    fileList = [fileList; getAllFiles(nextDir)]; % recursively call getAllFiles
end
end
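As an aside (not part of the original demo): on R2016b or later, a recursive file listing can also be obtained directly with dir's ** wildcard, which avoids the helper function entirely:
listing = dir(fullfile(Testing_Images_Path, '**', '*.jpg')); % recursive listing, R2016b+
fileList = fullfile({listing.folder}', {listing.name}');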
If you run this code twice and save the Excel results under different file names, you can compare them and see that the results differ from run to run. Any idea on how to solve this problem?
Any comment will be appreciated.
Meshoo
0 Comments
Accepted Answer
Naoya
on 15 Mar 2019
Unfortunately, there is no way to obtain reproducible results in GPU mode, even when the user specifies the same seed for the random number generator on the MATLAB side.
0 Comments
More Answers (1)
Naoya
on 14 Mar 2019
The irreproducibility that you report is due to the non-determinism of the cuDNN routines that Deep Learning Toolbox uses when training on the GPU. The non-deterministic routines are restricted to the backward methods, so a forward pass should always be reproducible.
The simplest way to guarantee reproducible training runs is to train on the CPU; this transfer learning script does not take long to train on the CPU anyway.
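For example, a minimal sketch reusing the options from the question (the 'ExecutionEnvironment' name-value argument of trainingOptions selects the hardware):
options = trainingOptions('sgdm', ...
    'MiniBatchSize',10, ...
    'MaxEpochs',6, ...
    'InitialLearnRate',1e-4, ...
    'Shuffle','every-epoch', ...
    'ExecutionEnvironment','cpu'); % force CPU training for deterministic results
rng(0)  % seed before training so shuffling and initialization are repeatable
netTransfer = trainNetwork(augimdsTrain, layers, options);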
3 Comments
Vishnu Keyen
on 6 Feb 2025 at 10:21
Will the same exact results be obtained when using a dropout layer in the model?
Setting the random seed (rng('default') and gpurng('default')) does not reproduce the results for me when using dropoutLayer() in my model. Without dropoutLayer() I am able to reproduce the results.
Your thoughts would be appreciated.
Naoya
on 10 Feb 2025 at 0:26
Hi Vishnu,
When executing on a GPU, it is generally assumed that consistency of results will not be maintained, regardless of the presence of a dropoutLayer. However, if you are using R2024b, you can maintain consistency of results when performing GPU calculations by using the deep.gpu.deterministicAlgorithms function, as follows.
previousState = deep.gpu.deterministicAlgorithms(true);
rng("default")
gpurng("default")