LSTM SequenceLength and Batch Explained
Hi MATLAB staff,
Regarding the current example: https://www.mathworks.com/help/deeplearning/examples/time-series-forecasting-using-deep-learning.html
In this example, SequenceLength defaults to 'longest'. Does that mean the sequence length is 500? (I don't think it is, as that would make the algorithm much more computationally expensive.) Does that mean the sequence length is 1, then? It is unclear how the data is fed into the model. Further, a single batch seems to be used (I assume so, because changing the mini-batch size does not change the algorithm). Could you elaborate on the sequence length and batch size in this particular example?
I think it is very important for the MATLAB community to have a full understanding of how the LSTM in MATLAB is designed.
Cheers, MB
2 Comments
John D'Errico
on 17 Mar 2019
We are not MATLAB staff, although some MathWorks employees may drop in here in their free time.
Accepted Answer
Abhishek Singh
on 25 Mar 2019
Hi MB Sylvest,
According to the documentation, the sequence length and mini-batch size are controlled by trainingOptions.
If you run the example and inspect the trainingOptions object, you'll see that it uses the default SequenceLength of 'longest' (which, for this example, means 500 time steps) and the default MiniBatchSize of 128.
I’ve added the screenshot for your reference:
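As a sketch of what that looks like in code (the option values below follow the linked example as best I recall; treat them as illustrative, and note that 'SequenceLength' and 'MiniBatchSize' are written out explicitly here even though the example relies on their defaults):

```matlab
% Training options for the time-series forecasting example (sketch).
% SequenceLength and MiniBatchSize are spelled out to show the defaults
% the example implicitly uses.
options = trainingOptions('adam', ...
    'MaxEpochs',250, ...
    'GradientThreshold',1, ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',125, ...
    'LearnRateDropFactor',0.2, ...
    'SequenceLength','longest', ... % default: pad sequences in each mini-batch to the longest one
    'MiniBatchSize',128, ...        % default mini-batch size
    'Verbose',0, ...
    'Plots','training-progress');

% Inspect the relevant properties:
options.SequenceLength  % 'longest'
options.MiniBatchSize   % 128
```

Note that because this example trains on a single observation (one sequence of 500 steps), each mini-batch effectively contains just that one sequence, so 'longest' resolves to its full length of 500, and the MiniBatchSize of 128 never comes into play.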
You may also find these links to be useful:
- https://www.mathworks.com/help/deeplearning/ug/long-short-term-memory-networks.html
- https://www.mathworks.com/help/deeplearning/ref/trainingoptions.html
0 Comments
More Answers (0)