Text Analytics Toolbox Model for BERT-Large Network
Pretrained BERT-Large Network for MATLAB
130 downloads
Updated 15 Oct 2025
BERT-Large is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of Natural Language Processing (NLP) tasks. This model has 24 self-attention layers and a hidden size of 1024.
To load the BERT-Large model, run the following code:
[net, tokenizer] = bert(Model="large");
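Once loaded, the tokenizer can convert raw text into the numeric token codes the network expects. A minimal sketch, assuming the `encode` method of the returned tokenizer object (the input string is only an example):

```matlab
% Load the pretrained BERT-Large model and its tokenizer
[net, tokenizer] = bert(Model="large");

% Encode a sentence into numeric token codes using the tokenizer.
% encode maps text to token indices in BERT's vocabulary.
str = "Text Analytics Toolbox supports BERT models.";
tokenCodes = encode(tokenizer, str);
```

The resulting token codes can then be passed to the network for downstream NLP tasks such as classification or feature extraction.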
MATLAB Release Compatibility
Created with
R2023b
Compatible with R2023b to R2026a
Platform Compatibility
Windows, macOS (Apple Silicon), macOS (Intel), Linux
