How many LSTM blocks are there in bidirectional LSTM layers?
Hi,
How can I relate the hidden units to the number of LSTM blocks?
inputSize = 17;
numHiddenUnits = 50;
numClasses = 2;
maxEpochs = 15;
miniBatchSize = 1;
layers = [ ...
    sequenceInputLayer(inputSize)
    bilstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
options = trainingOptions('adam', ...
    'ExecutionEnvironment','auto', ...
    'GradientThreshold',1, ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'SequenceLength','longest', ...
    'Shuffle','never', ...
    'Verbose',0, ...
    'Plots','training-progress');
1 Answer
Shantanu Dixit
on 20 Jun 2023
Hi Shweta,
Assuming that by hidden layers you mean numHiddenUnits: numHiddenUnits is the number of hidden units (LSTM blocks) per direction, and this is the same for both lstmLayer and bilstmLayer. So here each direction of the BiLSTM has 50 LSTM blocks (numHiddenUnits = 50). Because a BiLSTM combines a forward and a backward LSTM, its output with 'OutputMode','last' has 2*numHiddenUnits = 100 features.
Refer to the documentation for lstmLayer and bilstmLayer.
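As a quick check, you can inspect the learnable parameter sizes of an initialized bilstmLayer; this is a minimal sketch (assumes Deep Learning Toolbox, and that your release supports dlnetwork with a sequenceInputLayer). Each LSTM direction has 4 gates, so the input weights of a BiLSTM stack to 8*numHiddenUnits rows (4 gates x 2 directions):

```matlab
inputSize = 17;
numHiddenUnits = 50;

% Build and initialize a small network containing the BiLSTM layer
net = dlnetwork([ ...
    sequenceInputLayer(inputSize)
    bilstmLayer(numHiddenUnits,'OutputMode','last')]);

% Input weights: 8*numHiddenUnits-by-inputSize (4 gates x 2 directions)
size(net.Layers(2).InputWeights)

% Recurrent weights: 8*numHiddenUnits-by-numHiddenUnits
size(net.Layers(2).RecurrentWeights)
```

With numHiddenUnits = 50 and inputSize = 17, the sizes reported should be 400-by-17 and 400-by-50, confirming that each of the two directions carries its own set of 50 hidden units.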