Is there GPU support for the MATLAB GitHub BERT model?
15 views (last 30 days)
Bijan Sahamie on 21 Feb 2022
MATLAB does not seem to natively support models like BERT, but there is a GitHub repository where pre-trained BERT models can be loaded.
However, to me this seems a little "workaroundy" and completely side-steps the standard architecture and workflow that the Deep Learning Toolbox brings to MATLAB. As painful as this is (for now I can live with it), my main problem is the following:
I was not able to figure out how to use that code -- for instance, with the pretrained BERT or FinBERT models -- on my GPU (the GPU works, MATLAB finds it, etc.). Inference on a relatively small dataset takes ages (>25 min), compared to ~3 min on the GPU with a similar model and an identical dataset in TensorFlow.
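For context, here is a minimal sketch of the kind of thing I am trying to do, assuming the matlab-deep-learning/transformer-models repository. The idea of moving the learnable parameters to the GPU with dlupdate(@gpuArray, ...) is a documented MATLAB pattern for model functions, but the exact bert/bert.model API shown here is an assumption based on that repository's examples:

```matlab
% Hedged sketch (API assumed from the matlab-deep-learning/transformer-models
% repo): load pretrained BERT, then push its learnable parameters to the GPU.
mdl = bert();                                          % pretrained BERT (repo function)
mdl.Parameters = dlupdate(@gpuArray, mdl.Parameters);  % move all learnables to GPU

% Tokenize one sentence and run a forward pass; inputs must be on the GPU too.
str = "The market reacted positively to the news.";
seq = encode(mdl.Tokenizer, str);
x = dlarray(gpuArray(seq{1}));
features = bert.model(x, mdl.Parameters);
```

If this pattern is correct, the remaining question is why inference is still so much slower than the equivalent TensorFlow run.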
Help would be much appreciated. Thanks.
David Willingham on 24 Feb 2022
Did Walter's comment help speed up your training? What version of MATLAB are you using?
On your comments regarding the Transformer model implementation: we currently don't have the built-in layers to support Transformers. However, the flexibility of the framework allows users to create their own model functions when built-in layers don't exist. For more information, see this page: Train Deep Learning in MATLAB. As you've pointed out, though, this implementation does require more work than having full layer support. For reference, we are actively looking at supporting more Transformer layers in a future release.
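To make the model-function workflow mentioned above concrete, here is a generic sketch (not the BERT repository's code): parameters live in a struct of dlarrays, the model is an ordinary function built from deep learning operations such as fullyconnect and relu, and moving everything to the GPU is a single dlupdate(@gpuArray, ...) call on the parameter struct. All names below are illustrative:

```matlab
% Generic model-function sketch: a tiny fully connected classifier
% expressed as a function over a struct of learnable parameters.
parameters.W1 = dlarray(randn(64, 10) * 0.01);
parameters.b1 = dlarray(zeros(64, 1));
parameters.W2 = dlarray(randn(2, 64) * 0.01);
parameters.b2 = dlarray(zeros(2, 1));

% Move all learnables to the GPU in one call (if a supported GPU exists).
% parameters = dlupdate(@gpuArray, parameters);

X = dlarray(randn(10, 32), "CB");   % 10 features, batch of 32
Y = model(parameters, X);           % forward pass

function Y = model(parameters, X)
    % Ordinary function: the "network" is just composed dl* operations.
    Y = fullyconnect(X, parameters.W1, parameters.b1);
    Y = relu(Y);
    Y = fullyconnect(Y, parameters.W2, parameters.b2);
    Y = softmax(Y);
end
```

Gradients for such a model come from dlfeval and dlgradient in the usual custom-training-loop style; the same structure is what the GitHub BERT implementation uses in place of built-in layers.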