Text Analytics Toolbox Model for BERT-Small Network

Pretrained BERT-Small Network for MATLAB.


BERT-Small is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of Natural Language Processing (NLP) tasks. This model has 4 self-attention layers and a hidden size of 512.
To load a BERT-Small model, you can run the following code:
[net, tokenizer] = bert(Model="small");
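Once loaded, the returned tokenizer converts raw text into the numeric token codes the network expects. A minimal sketch, assuming the `encode` method of the bertTokenizer object provided by Text Analytics Toolbox (the example sentence is illustrative):

```matlab
% Load the pretrained BERT-Small model and its tokenizer.
[net, tokenizer] = bert(Model="small");

% Encode a sentence into BERT token codes. encode returns a
% cell array with one numeric vector of token codes per input
% string (assumes the bertTokenizer encode method).
str = "Text Analytics Toolbox supports BERT models.";
tokenCodes = encode(tokenizer, str);
```

The token codes can then be passed to the network for feature extraction or fine-tuning, following the workflows documented in Text Analytics Toolbox.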

MATLAB Release Compatibility

  • Compatible with R2023b through R2026a

Platform Compatibility

  • Windows
  • macOS (Apple Silicon)
  • macOS (Intel)
  • Linux