How do I export a Neural Network from MATLAB?
MathWorks Support Team
on 15 Feb 2017
Edited: MathWorks Support Team
on 3 Sep 2021
I have a neural network that I trained in MATLAB. I want to export the network so that I can use it with other frameworks, for example Caffe. How do I do that?
Accepted Answer
MathWorks Support Team
on 3 Sep 2021
Edited: MathWorks Support Team
on 3 Sep 2021
The recently released Neural Network Toolbox Converter for ONNX Model Format now allows one to export a trained Neural Network Toolbox™ deep learning network to the ONNX™ (Open Neural Network Exchange) model format. The ONNX model can then be imported into other deep learning frameworks, such as TensorFlow®, that support ONNX model import.
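As a minimal sketch of that export (assuming the converter support package is installed and your trained network is in a variable named net, which is a placeholder name):
% Export the trained network to an ONNX file
exportONNXNetwork(net, 'myNetwork.onnx')
The resulting .onnx file can then be loaded with the ONNX importer of your target framework.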
Alternatively, you could deploy the network via the MATLAB Compiler SDK.
Using the MATLAB Compiler SDK, you can save the trained network as a MAT-file, and write a MATLAB function that loads the network from the file, performs the desired computation, and returns the network's output.
You can then compile your MATLAB function into a shared library to be used in your C/C++, .NET, Java, or Python project.
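A minimal sketch of such a function (the file name trainedNet.mat and the variable names are placeholders, and predict here assumes a deep learning network object such as a SeriesNetwork):
function y = predictWithNet(x)
% Load the trained network from a MAT-file (only once) and return its output for input x.
persistent net
if isempty(net)
    data = load('trainedNet.mat', 'net');  % load the saved network
    net = data.net;
end
y = predict(net, x);                       % run the network on the input
end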
You can find more information about MATLAB Compiler SDK on its product page and in the documentation.
Furthermore, the objects that MATLAB uses to represent neural networks are transparent, so you can access all the information that describes your trained network.
For example, if you train a convolutional neural network, you get an object of type SeriesNetwork. You can then inspect the weights and biases of the trained network:
convnet.Layers(2).Weights  % weights of the network's second layer (e.g. the first convolutional layer)
convnet.Layers(2).Bias     % biases of the same layer
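For instance, a short sketch that walks over all layers and reports which ones carry learnable weights (convnet is the trained network from above):
% List every layer that has learnable weights, with the weight array size
for i = 1:numel(convnet.Layers)
    layer = convnet.Layers(i);
    if isprop(layer, 'Weights') && ~isempty(layer.Weights)
        fprintf('%s: weights of size %s\n', layer.Name, mat2str(size(layer.Weights)));
    end
end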
Then, using for example Caffe's MATLAB interface (matcaffe), you should be able to save a convolutional neural network as a Caffe model. The code for the MATLAB interface is available in the Caffe GitHub repository and includes a classification demo that shows you how to use the interface.
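A hedged sketch of that direction (the paths, prototxt, and layer names below are placeholders; it assumes you have a deploy.prototxt whose architecture matches your MATLAB network, and that matcaffe stores blobs in width-by-height-by-channel-by-number order, so the MATLAB weights are permuted before copying):
% Make caffe's MATLAB interface available (path is a placeholder)
addpath('/path/to/caffe/matlab');
caffe.set_mode_cpu();

% Create an untrained caffe net from a matching architecture definition
net = caffe.Net('deploy.prototxt', 'test');

% Copy the MATLAB weights and biases into the corresponding caffe layer
W = convnet.Layers(2).Weights;                                  % H x W x C x N in MATLAB
b = convnet.Layers(2).Bias;
net.params('conv1', 1).set_data(permute(single(W), [2 1 3 4])); % caffe expects W x H x C x N
net.params('conv1', 2).set_data(single(b(:)));                  % bias as a vector

% Save the populated net as a caffe model
net.save('trainedNet.caffemodel');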
Please note that the caffe interface code is not developed or supported by MathWorks Technical Support. If you have any questions about how to use it, please contact the project's developers.
More Answers (2)
Maria Duarte Rosa
on 25 Jun 2018
Edited: MathWorks Support Team
on 20 Aug 2021
The recently released Deep Learning Toolbox Converter for ONNX Model Format (https://www.mathworks.com/matlabcentral/fileexchange/67296-deep-learning-toolbox-converter-for-onnx-model-format) now allows you to export a trained Neural Network Toolbox™ deep learning network to the ONNX™ (Open Neural Network Exchange) model format. The ONNX model can then be imported into other deep learning frameworks, such as TensorFlow®, that support ONNX model import.
michael scheinfeild
on 6 Aug 2018
I still have no success importing it into C++ from ONNX; there are many compilation issues.
michael scheinfeild
on 14 Apr 2019
After testing ONNX, I found that the output of the convolutions is not the same as in MATLAB.
Vasil Ppov
on 20 Jul 2020
Yes, the output size is not the same! I exported an ONNX YOLO model from MATLAB to Python successfully, but the expected output tensor should have size 14x14x12, while the size in Python is 2x6x196 (2x6 = 12 and 14*14 = 196). Could you tell me why, and how to fix it?