How to implement a Transformer with separate encoder and decoder inputs and cross-attention in MATLAB?

10 views (last 30 days)
Hello,
I was able to build Transformer models in encoder-only and decoder-only form by stacking layers. However, I would now like to implement a model similar to the original Transformer paper, which takes two separate inputs (encoder input and decoder input) and applies cross-attention between them.
The difficulties I am facing are:
  1. trainNetwork does not seem to support multiple inputs.
  2. MATLAB does not provide a built-in cross-attention layer.
  3. When I try to implement a custom cross-attention layer, I run into many errors during training.
Is there any recommended way or workaround to build and train a Transformer model in MATLAB that supports encoder-decoder input structure with cross-attention?
Thank you in advance for your help.

Answers (1)

Matt J
Matt J on 20 Aug 2025
Edited: Matt J on 20 Aug 2025
trainNetwork does not seem to support multiple inputs.
trainNetwork is no longer recommended. You should be using trainnet, which does support multiple inputs. From the documentation:
"For neural networks with multiple inputs, you must use a TransformedDatastore or CombinedDatastore object."
MATLAB does not provide a built-in cross-attention layer.
Perhaps the example Create Cross-Attention Neural Network is relevant to you.
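That example is built around attentionLayer (R2024a or later), which takes separate query, key, and value inputs, so you can use it directly as the cross-attention block between the decoder and encoder streams. A rough single-block sketch (untested; the sizes, layer names, and class count are placeholders, and embeddings, positional encodings, and residual connections are omitted for brevity):

numChannels = 256;    % embedding dimension (placeholder)
numHeads    = 8;
numClasses  = 100;    % output vocabulary size (placeholder)

encoderLayers = [
    sequenceInputLayer(numChannels, Name="enc_in")
    selfAttentionLayer(numHeads, numChannels, Name="enc_selfattn")
    layerNormalizationLayer(Name="enc_norm")
    fullyConnectedLayer(numChannels, Name="enc_fc")
    layerNormalizationLayer(Name="enc_out")];

decoderLayers = [
    sequenceInputLayer(numChannels, Name="dec_in")
    selfAttentionLayer(numHeads, numChannels, AttentionMask="causal", Name="dec_selfattn")
    layerNormalizationLayer(Name="dec_norm1")
    attentionLayer(numHeads, Name="crossattn")    % query comes from the previous layer
    layerNormalizationLayer(Name="dec_norm2")
    fullyConnectedLayer(numClasses, Name="dec_fc")
    softmaxLayer(Name="out")];

net = dlnetwork(encoderLayers, Initialize=false);
net = addLayers(net, decoderLayers);

% Cross-attention: the decoder stream is the query; the encoder output
% supplies the key and value (the "query" input is already connected in series).
net = connectLayers(net, "enc_out", "crossattn/key");
net = connectLayers(net, "enc_out", "crossattn/value");

analyzeNetwork(net)   % verify the two-input graph before calling trainnet

A full Transformer would repeat the encoder and decoder blocks, add residual connections with additionLayer, and put word/position embeddings in front of each input, but the essential point is that attentionLayer lets you route the encoder output into the key and value inputs of a decoder-side attention layer, which is exactly the cross-attention in the original paper.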
