Is a Bi-GRU (bidirectional Gated Recurrent Unit) available, or is there a way to implement a Bi-GRU?

The following recurrent neural network (RNN) layers are available: lstmLayer, bilstmLayer, and gruLayer.
I would like to know whether a Bi-GRU exists or can be defined.
Thank you for your help.

Accepted Answer

Amanjit Dulai on 21 Oct 2021
Edited: Amanjit Dulai on 22 Nov 2022
A bi-LSTM layer works by applying two LSTM layers to the data: one in the forward direction and one in the reverse direction. You can apply an LSTM in the reverse direction by flipping the data. The results from these two LSTM layers are then concatenated to form the output of the bi-LSTM layer. So if we want to implement a bi-GRU layer, we can do this by using a custom flip layer together with GRU layers. A custom flip layer can be implemented as follows:
classdef FlipLayer < nnet.layer.Layer
    methods
        function layer = FlipLayer(name)
            layer.Name = name;
        end
        function Y = predict(~, X)
            % Reverse the input along the third (time) dimension
            Y = flip(X, 3);
        end
    end
end
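As a quick sanity check (an illustrative snippet, not part of the original answer; it assumes FlipLayer.m above is on the path), predicting through the layer reverses the time order of a C-by-B-by-T array:
layer = FlipLayer("flip");
X = reshape([10 20 30 40], 1, 1, 4);   % 1 channel, 1 observation, 4 time steps
Y = predict(layer, X);
squeeze(Y)'                            % returns 40 30 20 10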
We can implement a bi-GRU layer with OutputMode="sequence" by arranging layers in the way shown below:
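A minimal sketch of that arrangement (hidden size, input size, and layer names here are placeholders): the reverse-direction GRU sees a flipped copy of the input, and its output is flipped back so both directions are aligned in time before concatenation.
numFeatures = 12;  numHidden = 100;    % placeholder sizes
lg = layerGraph();
lg = addLayers(lg, [
    sequenceInputLayer(numFeatures, "Name", "in")
    gruLayer(numHidden, "OutputMode", "sequence", "Name", "gru_fwd")
    concatenationLayer(1, 2, "Name", "cat")]);
lg = addLayers(lg, [
    FlipLayer("flip_in")
    gruLayer(numHidden, "OutputMode", "sequence", "Name", "gru_bwd")
    FlipLayer("flip_out")]);
lg = connectLayers(lg, "in", "flip_in");
lg = connectLayers(lg, "flip_out", "cat/in2");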
We can implement a bi-GRU layer with OutputMode="last" by arranging layers in the way shown below:
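A corresponding sketch for the "last" case (again with placeholder sizes and names): the reverse branch needs no second flip, because a GRU with OutputMode="last" already returns only a single time step.
numFeatures = 12;  numHidden = 100;    % placeholder sizes
lg = layerGraph();
lg = addLayers(lg, [
    sequenceInputLayer(numFeatures, "Name", "in")
    gruLayer(numHidden, "OutputMode", "last", "Name", "gru_fwd")
    concatenationLayer(1, 2, "Name", "cat")]);
lg = addLayers(lg, [
    FlipLayer("flip_in")
    gruLayer(numHidden, "OutputMode", "last", "Name", "gru_bwd")]);
lg = connectLayers(lg, "in", "flip_in");
lg = connectLayers(lg, "gru_bwd", "cat/in2");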
Below is a short example showing how to use the custom flip layer mentioned above to implement a network with two bi-GRU layers (one with OutputMode="sequence" and one with OutputMode="last"):
[XTrain, YTrain] = japaneseVowelsTrainData;

lg = layerGraph();
% Main branch: forward-direction GRUs plus the two concatenations
lg = addLayers(lg, [
    sequenceInputLayer(12, "Name", "input")
    gruLayer(100, "OutputMode", "sequence", "Name", "gru1")
    concatenationLayer(1, 2, "Name", "cat1")
    gruLayer(100, "OutputMode", "last", "Name", "gru3")
    concatenationLayer(1, 2, "Name", "cat2")
    fullyConnectedLayer(9)
    softmaxLayer
    classificationLayer()]);
% Reverse-direction branch of the first (sequence-mode) bi-GRU
lg = addLayers(lg, [
    FlipLayer("flip1")
    gruLayer(100, "OutputMode", "sequence", "Name", "gru2")
    FlipLayer("flip2")]);
% Reverse-direction branch of the second (last-mode) bi-GRU
lg = addLayers(lg, [
    FlipLayer("flip3")
    gruLayer(100, "OutputMode", "last", "Name", "gru4")]);
lg = connectLayers(lg, "input", "flip1");
lg = connectLayers(lg, "flip2", "cat1/in2");
lg = connectLayers(lg, "cat1", "flip3");
lg = connectLayers(lg, "gru4", "cat2/in2");

options = trainingOptions('adam', 'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, lg, options);

[XTest, YTest] = japaneseVowelsTestData;
YPred = classify(net, XTest);
accuracy = sum(YTest == YPred)/numel(YTest)
7 Comments
Amanjit Dulai on 21 Jun 2022
The third dimension is the time dimension. For a bi-GRU, we want one GRU layer to operate on the sequence in the forward time direction, and then we want one GRU layer to operate on the sequence in the reverse time direction. We can get it to operate on the sequence in the reverse time direction by flipping in the third dimension.
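For illustration (a hedged sketch, not from the original comment), flipping along dimension 3 of a C-by-B-by-T array reverses only the time order and leaves the channel dimension untouched:
X = cat(3, [1;2], [3;4], [5;6]);   % 2 channels, 1 observation, 3 time steps
Xr = flip(X, 3);                   % time steps now in the order t3, t2, t1
squeeze(X)                         % columns are t1, t2, t3: [1 3 5; 2 4 6]
squeeze(Xr)                        % columns are t3, t2, t1: [5 3 1; 6 4 2]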


More Answers (5)

Ronny Guendel on 22 Oct 2021
Here is the full working code for me:
[XTrain, YTrain] = japaneseVowelsTrainData;

lg = layerGraph();
lg = addLayers(lg, [
    sequenceInputLayer(12, "Name", "input")
    gruLayer(100, "OutputMode", "sequence", "Name", "gru1")
    concatenationLayer(1, 2, "Name", "cat1")
    gruLayer(100, "OutputMode", "last", "Name", "gru3")
    concatenationLayer(1, 2, "Name", "cat2")
    fullyConnectedLayer(9, "Name", "fc")
    softmaxLayer("Name", "softmax")
    classificationLayer("Name", "classoutput")]);
lg = addLayers(lg, [
    FlipLayer("flip1")
    gruLayer(100, "OutputMode", "sequence", "Name", "gru2")
    FlipLayer("flip2")]);
lg = addLayers(lg, [
    FlipLayer("flip3")
    gruLayer(100, "OutputMode", "last", "Name", "gru4")]);
lg = connectLayers(lg, "input", "flip1");
lg = connectLayers(lg, "flip2", "cat1/in2");
lg = connectLayers(lg, "cat1", "flip3");
lg = connectLayers(lg, "gru4", "cat2/in2");

options = trainingOptions('adam', 'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, lg, options);

[XTest, YTest] = japaneseVowelsTestData;
YPred = classify(net, XTest);
accuracy = sum(YTest == YPred)/numel(YTest)

义 on 22 Nov 2022
Why can't I find FlipLayer in MATLAB R2022a?
2 Comments
Amanjit Dulai on 22 Nov 2022
FlipLayer is a custom layer you need to implement. The code is shown below:
classdef FlipLayer < nnet.layer.Layer
    methods
        function layer = FlipLayer(name)
            layer.Name = name;
        end
        function Y = predict(~, X)
            Y = flip(X, 3);
        end
    end
end



义 on 25 Nov 2022
If my data is 1×n power system load data, how should I set this parameter?

song on 28 Jun 2023
If my data is 1×n bearing system load data, how should I set this parameter?

Artem Lensky on 17 Aug 2023
Edited: Artem Lensky on 17 Aug 2023
Below is a multi-block BiGRU implementation with layer normalization and dropout layers.
The function takes the following parameters:
numFeatures, numBlocks, numHiddenUnits, numResponse, dropoutFactor
and constructs the BiGRU as follows:
function lgraph = constructBiGRU(numFeatures, numBlocks, numHiddenUnits, numResponse, dropoutFactor)
    arguments
        numFeatures = 1
        numBlocks = 1
        numHiddenUnits = 32
        numResponse = 3
        dropoutFactor = 0
    end
    layer = sequenceInputLayer(numFeatures, 'Name', 'sequenceInputLayer');
    lgraph = layerGraph(layer);
    outputName = lgraph.Layers(end).Name;
    for i = 1:numBlocks
        % Forward-direction GRU plus concatenation, dropout and layer norm
        layers = [
            gruLayer(numHiddenUnits, OutputMode='sequence', Name="gru_1_" + i)
            concatenationLayer(1, 2, Name="cat_" + i)
            dropoutLayer(dropoutFactor, Name="dropout" + i)
            layerNormalizationLayer(Name="layernorm_" + i)];
        lgraph = addLayers(lgraph, layers);
        % Reverse-direction GRU branch (flip, GRU, flip back)
        layers = [
            FlipLayer("flip_1_" + i)
            gruLayer(numHiddenUnits, OutputMode='sequence', Name="gru_2_" + i)
            FlipLayer("flip_2_" + i)];
        lgraph = addLayers(lgraph, layers);
        lgraph = connectLayers(lgraph, outputName, "gru_1_" + i);
        lgraph = connectLayers(lgraph, outputName, "flip_1_" + i);
        lgraph = connectLayers(lgraph, "flip_2_" + i, "cat_" + i + "/in2");
        outputName = "layernorm_" + i;
    end
    % Remove the final layer normalization layer; the output head connects
    % directly to the last dropout layer.
    lgraph = removeLayers(lgraph, "layernorm_" + i);
    layers = [
        fullyConnectedLayer(numResponse, 'Name', 'fc')
        softmaxLayer
        classificationLayer];
    lgraph = addLayers(lgraph, layers);
    lgraph = connectLayers(lgraph, "dropout" + i, "fc");
end
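A possible usage sketch (not part of the original answer; the parameter values are arbitrary), assuming FlipLayer.m and constructBiGRU.m are on the path:
lgraph = constructBiGRU(1, 2, 32, 3, 0.2);   % 1 feature, 2 BiGRU blocks, 32 hidden units
analyzeNetwork(lgraph)                        % or: figure; plot(lgraph)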
