What is the order of weights in an LSTM layer with multiple LSTM neurons?
The picture at the bottom of the documentation at https://www.mathworks.com/help/deeplearning/ref/nnet.cnn.layer.lstmlayer.html
says that the weights are concatenated in this order: W = [Wi Wf Wg Wo]. It does not address multiple LSTM neurons.
Take the case of 2 LSTM neurons.
Are the weights concatenated according to :
[Wi_neuron1 Wf_neuron1 Wg_neuron1 Wo_neuron1 Wi_neuron2 Wf_neuron2 Wg_neuron2 Wo_neuron2]
OR
[Wi_neuron1 Wi_neuron2 Wf_neuron1 Wf_neuron2 Wg_neuron1 Wg_neuron2 Wo_neuron1 Wo_neuron2]
In other words, are the gates grouped together for all neurons, or do the 4 gates repeat for all neurons?
Answers (2)
Sanjana
on 21 Aug 2023
Hi,
I understand that you are having trouble interpreting the order of weights in an LSTM layer with multiple units. As per the documentation, each LSTM unit is associated with an input gate, a forget gate, a cell candidate, and an output gate.
The learnable weights described in the documentation belong to the LSTM layer as a whole, not to an individual LSTM unit. By checking the Learnables of the following example dlnetwork object with an LSTM layer, we can interpret the learnable weights as explained in the documentation: they are formed by concatenating the weight matrices associated with the input gate, forget gate, cell candidate, and output gate of the individual LSTM cells in the layer.
inputSize = 1;
embeddingDimension = 100;
numWords = 2920;
numClasses = numWords + 1;
layers = [
sequenceInputLayer(inputSize)
wordEmbeddingLayer(embeddingDimension,numWords)
lstmLayer(100)
dropoutLayer(0.2)
fullyConnectedLayer(numClasses)
softmaxLayer];
lgraph = layerGraph(layers);
dlnet = dlnetwork(lgraph);
The learnable parameters of the above dlnetwork object can be inspected via dlnet.Learnables.
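To see the ordering concretely, the concatenated weight matrix can be split back into its per-gate blocks. The sketch below assumes the layout given in the lstmLayer documentation, where InputWeights stacks the four gate blocks vertically as [Wi; Wf; Wg; Wo], and within each block row k belongs to hidden unit k. Under that layout, the answer to the question above is the second ordering: the gates are grouped together, and the units repeat within each gate.

```matlab
% Locate the LSTM layer in the network built above (layer 3) and pull out
% its input weights, a 4*numHiddenUnits-by-inputSize matrix.
lstm = dlnet.Layers(3);
W = lstm.InputWeights;
H = lstm.NumHiddenUnits;        % 100 in this example

% Per the documentation, W = [Wi; Wf; Wg; Wo]: four gate blocks stacked
% vertically, each block holding one row per hidden unit.
Wi = W(1:H, :);                 % input gate weights, all units
Wf = W(H+1:2*H, :);             % forget gate weights, all units
Wg = W(2*H+1:3*H, :);           % cell candidate weights, all units
Wo = W(3*H+1:4*H, :);           % output gate weights, all units

% The weights for one individual unit, e.g. unit 2, are one row per block:
Wi_unit2 = Wi(2, :);
Wf_unit2 = Wf(2, :);
```

The same slicing applies to RecurrentWeights (4*numHiddenUnits-by-numHiddenUnits) and to Bias (4*numHiddenUnits-by-1).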
Please refer to the documentation for more information.
Hope this helps.
Regards,
Sanjana.