MATLAB to OpenVINO (Intel-Inteference)

Version 1.0.0 (1.84 MB) by Kevin Chng
Deploy and optimize your trained model on Intel processors
223 downloads
Updated 18 Feb 2019

View License

Overview:

If you train your deep learning network in MATLAB, you can use OpenVINO to accelerate your solution on Intel®-based accelerators (CPUs, GPUs, FPGAs, and VPUs). However, this script does not compare OpenVINO with MATLAB's own deployment options (MATLAB Coder, HDL Coder); instead, it only gives you a rough idea of how to complete the MATLAB-to-OpenVINO workflow from a technical perspective.

Refer to the link below to learn more about OpenVINO:
https://software.intel.com/en-us/openvino-toolkit
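
On the MATLAB side, the export step comes down to a single call to exportONNXNetwork (part of the Deep Learning Toolbox Converter for ONNX Model Format support package). A minimal sketch, assuming a trained network saved in a MAT-file; the file and variable names are placeholders, not taken from this submission:

% Load a previously trained network (placeholder file/variable names)
load('trainedNet.mat', 'net');
% Export it to ONNX so that OpenVINO's Model Optimizer can consume it
exportONNXNetwork(net, 'trainedNet.onnx');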

Highlights:
Deep Learning and Prediction
How to export a deep learning model to ONNX format
How to deploy a simple classification application in OpenVINO R4 (third-party software), as sketched below
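
On the OpenVINO side, deployment is roughly two steps: convert the ONNX file into OpenVINO's Intermediate Representation (an .xml/.bin pair) with the Model Optimizer, then run that IR through the Inference Engine, for example via the classification sample bundled with OpenVINO 2018 R4. The sketch below shells out from MATLAB and assumes a Linux installation with the OpenVINO environment already set up; the paths, file names, and exact Model Optimizer flags are assumptions and may differ by OpenVINO version:

% Convert the exported ONNX model to IR with the Model Optimizer (assumed paths)
system('python3 mo.py --input_model trainedNet.onnx --output_dir ir');
% Classify an image with the prebuilt classification sample on the CPU device
system('./classification_sample -m ir/trainedNet.xml -i test_image.png -d CPU');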

Product Focus:
MATLAB
Deep Learning Toolbox
OpenVINO R4 (third-party software)

Written on 28 January 2018

Cite As

Kevin Chng (2025). MATLAB to OpenVINO (Intel-Inteference) (https://www.mathworks.com/matlabcentral/fileexchange/70330-matlab-to-openvino-intel-inteference), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2018b
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Find more on Deep Learning Toolbox in Help Center and MATLAB Answers

Version  Published  Release Notes
1.0.0