How to implement a Transformer with separate encoder and decoder inputs and cross-attention in MATLAB?
Hello,
I was able to build Transformer models in encoder-only and decoder-only form by stacking layers. However, I would now like to implement a model similar to the original Transformer paper, which takes two separate inputs (encoder input and decoder input) and applies cross-attention between them.
The difficulties I am facing are:
- trainNetwork does not seem to support multiple inputs.
- MATLAB does not provide a built-in cross-attention layer.
- When I try to implement a custom cross-attention layer, I run into numerous errors during training.
Is there any recommended way or workaround to build and train a Transformer model in MATLAB that supports encoder-decoder input structure with cross-attention?
Thank you in advance for your help.
Answers (1)
Matt J on 20 Aug 2025 (edited 20 Aug 2025)
"trainNetwork does not seem to support multiple inputs."
trainNetwork is deprecated. You should be using trainnet, which supports multiple inputs. From the documentation:
"For neural networks with multiple inputs, you must use a TransformedDatastore or CombinedDatastore object."
"MATLAB does not provide a built-in cross-attention layer."
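One possible workaround, assuming a release that includes attentionLayer (R2024a or later), is to wire that layer's "query" input to the decoder branch and its "key" and "value" inputs to the encoder branch, which is exactly the cross-attention pattern. Below is a minimal, untested sketch of such a two-input dlnetwork; the layer names and sizes are illustrative only, and it omits positional encodings, residual connections, masking, and the stacking of multiple blocks:

```matlab
% Minimal two-input encoder-decoder sketch with cross-attention.
numChannels = 64;    % feature dimension of both inputs (illustrative)
numHeads    = 4;     % number of attention heads (illustrative)
numClasses  = 10;    % output size, task-dependent (illustrative)

encoderBranch = [
    sequenceInputLayer(numChannels, Name="encIn")
    selfAttentionLayer(numHeads, numChannels, Name="encSelfAttn")
    layerNormalizationLayer(Name="encNorm")];

decoderBranch = [
    sequenceInputLayer(numChannels, Name="decIn")
    selfAttentionLayer(numHeads, numChannels, Name="decSelfAttn")
    layerNormalizationLayer(Name="decNorm")];

outputBranch = [
    attentionLayer(numHeads, Name="crossAttn")   % query/key/value connected below
    layerNormalizationLayer(Name="crossNorm")
    fullyConnectedLayer(numClasses, Name="fcOut")
    softmaxLayer(Name="sm")];

net = dlnetwork;                        % empty network, then add the branches
net = addLayers(net, encoderBranch);
net = addLayers(net, decoderBranch);
net = addLayers(net, outputBranch);

% Cross-attention wiring: queries come from the decoder branch,
% keys and values come from the encoder branch.
net = connectLayers(net, "decNorm", "crossAttn/query");
net = connectLayers(net, "encNorm", "crossAttn/key");
net = connectLayers(net, "encNorm", "crossAttn/value");

analyzeNetwork(net)   % optional sanity check of the two-input graph
```

A network along these lines can then be trained with trainnet and a CombinedDatastore as sketched above.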