Text Analytics Toolbox Model for BERT-Base Multilingual Cased Network
Pretrained BERT-Base Multilingual Cased Network for MATLAB
Updated 11 Dec 2024
BERT-Base Multilingual Cased is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. The model has 12 self-attention layers and a hidden size of 768.
To load the BERT-Base Multilingual Cased model, run the following code:
[net, tokenizer] = bert(Model="multilingual-cased");
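The bert function also returns the matching WordPiece tokenizer. Below is a minimal usage sketch, not part of the original description: the example sentences and variable names are illustrative, and the exact inputs the returned dlnetwork expects should be checked against net.InputNames before calling predict.

[net, tokenizer] = bert(Model="multilingual-cased");

% The same tokenizer handles text in many languages.
str = ["BERT works well for many languages." "BERT funktioniert für viele Sprachen."];

% encode returns WordPiece token codes and segment indices for each document.
[tokenCodes, segments] = encode(tokenizer, str);
numel(tokenCodes{1})   % number of tokens in the first sentence

% The pretrained model is a dlnetwork object; inspect its expected inputs
% before assembling dlarray inputs for predict.
disp(net.InputNames)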
MATLAB Release Compatibility
Created with
R2023b
Compatible with R2023b to R2025a
Platform Compatibility
Windows, macOS (Apple silicon), macOS (Intel), Linux