
How to improve the validation accuracy of a CNN network in deep learning?

How to increase the validation accuracy to more than 90%?
The layers and training options used are the following:
layers = [
    imageInputLayer([227 227 3],"Name","data")
    convolution2dLayer([11 11],94,"Name","conv1","BiasLearnRateFactor",2,"Stride",[4 4])
    reluLayer("Name","relu1")
    crossChannelNormalizationLayer(5,"Name","norm1","K",1)
    maxPooling2dLayer([3 3],"Name","pool1","Stride",[2 2])
    groupedConvolution2dLayer([5 5],94,2,"Name","conv2","BiasLearnRateFactor",2,"Padding",[2 2 2 2])
    reluLayer("Name","relu2")
    crossChannelNormalizationLayer(5,"Name","norm2","K",1)
    maxPooling2dLayer([3 3],"Name","pool2","Stride",[2 2])
    convolution2dLayer([3 3],94,"Name","conv3","BiasLearnRateFactor",2,"Padding",[1 1 1 1])
    reluLayer("Name","relu3")
    groupedConvolution2dLayer([2 2],64,2,"Name","conv4","BiasLearnRateFactor",2,"Padding",[1 1 1 1])
    reluLayer("Name","relu4")
    groupedConvolution2dLayer([3 3],128,2,"Name","conv5","BiasLearnRateFactor",2,"Padding",[1 1 1 1])
    reluLayer("Name","relu5")
    maxPooling2dLayer([3 3],"Name","pool5","Stride",[2 2])
    fullyConnectedLayer(500,"Name","fc6","BiasLearnRateFactor",2)
    reluLayer("Name","relu6")
    dropoutLayer(0.5,"Name","drop6")
    fullyConnectedLayer(500,"Name","fc7","BiasLearnRateFactor",2)
    reluLayer("Name","relu7")
    dropoutLayer(0.5,"Name","drop7")
    fullyConnectedLayer(100,"Name","fc8","BiasLearnRateFactor",2)
    fullyConnectedLayer(4,"Name","new fc","BiasLearnRateFactor",10,"WeightLearnRateFactor",10)
    softmaxLayer("Name","prob")
    classificationLayer("Name","classoutput")];
miniBatchSize = 25; % 128
valFrequency = floor(numel(augimdsTrain.Files)/miniBatchSize);
options = trainingOptions('sgdm', ...
    'MiniBatchSize',36, ...           % 32; note: differs from the miniBatchSize used for valFrequency above
    'MaxEpochs',10, ...
    'InitialLearnRate',0.001, ...     % 0.01
    'LearnRateDropFactor',0.1, ...    % has no effect unless 'LearnRateSchedule' is set to 'piecewise'
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',valFrequency, ...
    'ValidationPatience',4, ...
    'Verbose',false, ...
    'Plots','training-progress');

Answers (1)

Mahesh Taparia on 14 Dec 2020
Hi
Looking at the loss curve, it seems the loss has not saturated yet, so you can train for more epochs and check the performance. Also try the Adam optimizer; it may improve the performance. Moreover, you can experiment with the network architecture and hyperparameters to check whether there is some improvement. For example, add 1-2 more fully connected layers after the layer with 100 nodes (a rough sketch of these changes is given below). Hope it will help!
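For reference, here is a minimal sketch of those suggestions, assuming the layers array, miniBatchSize, valFrequency, and the augimdsTrain/augimdsValidation datastores from the question; the extra layer size (50 nodes), epoch count, and learning rate are illustrative values, not tuned ones:

% Insert 1-2 extra fully connected layers after fc8 (the 100-node layer)
extraLayers = [
    reluLayer("Name","relu8")
    fullyConnectedLayer(50,"Name","fc9","BiasLearnRateFactor",2)
    reluLayer("Name","relu9")];
layers = [layers(1:end-3); extraLayers; layers(end-2:end)]; % keep new fc / prob / classoutput at the end

% Retrain with the Adam optimizer and a longer run
options = trainingOptions('adam', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',30, ...
    'InitialLearnRate',1e-4, ...
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',valFrequency, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(augimdsTrain,layers,options);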
2 Comments
Mohamed Elbeialy on 14 Dec 2020
I have tried all your suggestions, including the Adam optimizer, but no improvement happened. Do you have any further advice?
Mahesh Taparia on 14 Dec 2020
Looking at this curve, it seems the training and validation accuracy improved by about 5%. Train with more epochs, since the curve has not saturated yet, or try a different network architecture (see the sketch below).
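As a concrete example of the longer-training route, here is a sketch using the same variables as above, with a 'piecewise' schedule so that the LearnRateDropFactor from the original options actually takes effect; the epoch count and drop period are illustrative:

% ValidationPatience is omitted here so early stopping does not cut the longer run short
options = trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',40, ...
    'InitialLearnRate',0.001, ...
    'LearnRateSchedule','piecewise', ... % enables the drop factor/period below
    'LearnRateDropFactor',0.1, ...
    'LearnRateDropPeriod',10, ...
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',valFrequency, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(augimdsTrain,layers,options);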

