Output Function to Save Net on Every Validation

Grant Anderson on 6 May 2020
Commented: Ameer Hamza on 12 May 2020
I'm curious whether it's possible to define an output function that captures the current state of the network while training is still running, storing the current net into a structure in the same way that [net,tr] = trainNetwork() returns the final network when training finishes, but doing so at every validation, something like:
1: Net, TR
2: Net, TR
3: Net, TR
4: Net, TR
etc.
I can't use checkpoints because I am using the ADAM solver for my network.
1 Comment
Ameer Hamza on 6 May 2020
It seems that the 'OutputFcn' callback cannot save the network itself after each iteration; it only receives training metrics. Is saving just the state of network training enough?
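For reference, here is a minimal sketch of what an output function actually receives; the field names follow the R2020a trainingOptions documentation, and the trained network object is not among them:

% Output function that only inspects the metrics available in info
function stop = inspectInfo(info)
    if info.Iteration > 0   % metrics are empty in the 'start' state
        fprintf('Epoch %d, iteration %d, training loss %.4f\n', ...
            info.Epoch, info.Iteration, info.TrainingLoss);
    end
    stop = false;   % return true to request early stopping
end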


Answers (1)

Ameer Hamza on 6 May 2020
Edited: Ameer Hamza on 6 May 2020
If you just want to save the training states, try the following example, adapted from this documentation example: https://www.mathworks.com/help/releases/R2020a/deeplearning/ref/trainingoptions.html#bvniuj4
% Load the digit data and hold out 1000 images for validation
[XTrain,YTrain] = digitTrain4DArrayData;
idx = randperm(size(XTrain,4),1000);
XValidation = XTrain(:,:,:,idx);
XTrain(:,:,:,idx) = [];
YValidation = YTrain(idx);
YTrain(idx) = [];

% Simple convolutional classifier for the 28x28 digit images
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,8,'Padding','same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Register outFcn so it runs once per training iteration
options = trainingOptions('sgdm', ...
    'MaxEpochs',8, ...
    'ValidationData',{XValidation,YValidation}, ...
    'ValidationFrequency',30, ...
    'Verbose',false, ...
    'Plots','training-progress', ...
    'OutputFcn',@outFcn);

% Shared storage for the info structs collected during training
global training_state
training_state = [];

net = trainNetwork(XTrain,YTrain,layers,options);

% Output function: append this iteration's info struct and keep training
function stop = outFcn(info)
    global training_state
    training_state = [training_state info];
    stop = false;
end
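Once training finishes, training_state is a struct array with one entry per call, so the logged metrics can be pulled out directly; for example, assuming the documented R2020a info fields:

% Recover the loss curve from the collected info structs
plot([training_state.TrainingLoss])
xlabel('Iteration')
ylabel('Training loss')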
The global variable can be avoided if you define your own handle class and pass an instance of it to the 'OutputFcn'; see the sketch below. However, if you are fine with using a global, it shouldn't be an issue.
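A minimal sketch of that handle-class approach, assuming a class and method name of my own choosing (TrainingLogger and record are illustrative, not from the original post). Save it in its own file, TrainingLogger.m:

% TrainingLogger.m -- accumulates info structs by reference, no global needed
classdef TrainingLogger < handle
    properties
        States = []   % struct array, one element per 'OutputFcn' call
    end
    methods
        function stop = record(obj, info)
            obj.States = [obj.States info];   % handle semantics: mutates the shared object
            stop = false;                     % never request early stopping here
        end
    end
end

Because TrainingLogger is a handle class, the copy captured by the anonymous function and the copy in your workspace refer to the same object:

logger = TrainingLogger();
options = trainingOptions('sgdm', ...
    'MaxEpochs',8, ...
    'OutputFcn',@(info) logger.record(info));
net = trainNetwork(XTrain,YTrain,layers,options);
numel(logger.States)   % one entry per call, no global required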
4 Comments
Grant Anderson on 11 May 2020
Edited: Grant Anderson on 11 May 2020
This is the function I feed parameters into to train a neural network. It lets me resize the fully connected layers and the number of neurons.
function [net,tr] = betNet(X,y,X_test,y_test,X_cv,y_cv,maxE,NHL,fcls)
%% ===== Setting up DNN =====
% Fully connected layers: NHL hidden layers of fcls neurons, 2-class output
fcl1 = fullyConnectedLayer(fcls,'BiasInitializer','narrow-normal');
fcl2 = fullyConnectedLayer(2,'BiasInitializer','ones');
ip = sequenceInputLayer(size(X,1),'Normalization','zerocenter');
sml = softmaxLayer('Name','sml');   % defined but unused; layers below uses softmaxLayer directly
options = trainingOptions('adam', ...
    'MaxEpochs',maxE, ...
    'ExecutionEnvironment','gpu', ...
    'Shuffle','every-epoch', ...
    'MiniBatchSize',64, ...
    'ValidationFrequency',50, ...
    'ValidationData',{X_cv,y_cv}, ...
    'OutputFcn',@outFcn);
layers = [ip repmat(fcl1,1,NHL) fcl2 softmaxLayer classificationLayer];

% Shared storage for the info structs logged by outFcn
global training_state
training_state = [];

%% ===== Training NN =====
[net,tr] = trainNetwork(X,y,layers,options);

    % Nested output function: append this iteration's info and continue
    function stop = outFcn(info)
        global training_state
        training_state = [training_state info];
        stop = false;
    end
end
Ameer Hamza on 12 May 2020
If you want to check the value of training_state in the base workspace after your function has run, you also need to run the following line in the command window before calling your function:
global training_state
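After running that line, a hypothetical calling sequence (the numeric arguments are placeholders, not values from the original post):

[net,tr] = betNet(X,y,X_test,y_test,X_cv,y_cv,100,3,128);
numel(training_state)                 % one info struct per OutputFcn call
plot([training_state.TrainingLoss])   % loss curve logged during training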

