Deep Learning Toolbox Interface for LiteRT Library
Incorporate pretrained LiteRT (formerly TensorFlow Lite, or TFLite) models into MATLAB and Simulink applications for simulation and deployment to hardware.
Updated: 15 October 2025
The Deep Learning Toolbox Interface for LiteRT Library enables you to run cosimulations of MATLAB and Simulink applications with LiteRT (formerly TensorFlow Lite, or TFLite) models. This workflow lets you use pretrained LiteRT models, including classification and object detection networks, with the rest of the application code implemented in MATLAB or Simulink for development and testing.
Inference of pretrained LiteRT models is executed by the LiteRT Interpreter while the rest of the application code is executed by MATLAB or Simulink. Data exchange between MATLAB or Simulink and LiteRT is handled automatically.
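For illustration, a minimal sketch of this simulation workflow is shown below. The model file name (mobilenet.tflite) and the 224-by-224-by-3 input size are placeholders; substitute your own model and preprocessed input.

    % Load a pretrained LiteRT model; inference runs in the LiteRT
    % Interpreter while this script executes in MATLAB.
    net = loadTFLiteModel('mobilenet.tflite');   % placeholder model file
    disp(net.InputSize)                          % input size reported by the model

    % Sample input: random data shaped like a 224x224 RGB image
    % (replace with your own preprocessed image).
    in = rand(224, 224, 3, 'single');

    % Run inference; the scores are computed by the LiteRT Interpreter.
    scores = predict(net, in);
    [~, idx] = max(scores);
    fprintf('Predicted class index: %d\n', idx);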
When used with MATLAB Coder, you can generate C++ code for the complete application for deployment to target hardware. In the generated code, inference of the LiteRT model is executed by the LiteRT Interpreter while C++ code is generated for the remainder of the MATLAB or Simulink application, including pre- and post-processing. Data exchange between the generated code and the LiteRT Interpreter is again handled automatically.
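A minimal sketch of that deployment workflow, assuming a hypothetical entry-point function tflite_predict.m and the same placeholder model and input size as above:

    % tflite_predict.m -- entry-point function for code generation.
    function out = tflite_predict(in)
    % Persistent network object so the model is loaded only once in the
    % generated code.
    persistent net;
    if isempty(net)
        net = loadTFLiteModel('mobilenet.tflite');  % placeholder model file
    end
    out = predict(net, in);
    end

Generate C++ library code with MATLAB Coder; in the generated code, the call to predict is routed to the LiteRT Interpreter:

    cfg = coder.config('lib');
    cfg.TargetLang = 'C++';
    codegen -config cfg tflite_predict -args {ones(224,224,3,'single')} -report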
If you need to generate code for the LiteRT models themselves, alongside the pre- and post-processing code, use the MATLAB Coder Support Package for PyTorch and LiteRT Models.
Please see the following list of prerequisites for using this software package:
If you experience download or installation problems, please contact MathWorks Technical Support.
MATLAB Release Compatibility
Created with R2022a
Compatible with R2022a to R2026a
Platform Compatibility
Windows, macOS (Apple Silicon), macOS (Intel), Linux
