loadTFLiteModel
Description
Add-On Required: This feature requires the Deep Learning Toolbox Interface for LiteRT Library add-on.
net = loadTFLiteModel(modelFileName) loads a pretrained TensorFlow™ Lite model file modelFileName and returns a TFLiteModel object.
Use this TFLiteModel object with the predict function
in your MATLAB® code to perform inference in MATLAB execution, code generation, or a MATLAB Function block in
Simulink® models. For more information, see Prerequisites for Deep Learning with TensorFlow Lite Models.
___ = loadTFLiteModel(___,Name=Value)
loads a pretrained TensorFlow Lite model file with options specified by one or more name-value arguments,
in addition to the input arguments in previous syntaxes. For example, to
maintain and produce the same data layout as the pretrained TFLite model, set
PreserveDataFormats to true.
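A minimal sketch of the name-value syntax, assuming a model file named "mobilenet_v1.tflite" (a placeholder for your own .tflite file):

```matlab
% Load a TFLite model while preserving its original data layout.
% "mobilenet_v1.tflite" is a placeholder file name.
net = loadTFLiteModel("mobilenet_v1.tflite", PreserveDataFormats=true);

% With PreserveDataFormats=true, predict expects input data in the
% layout used by the TFLite model itself (for example, NHWC for image
% models) and returns output in that same layout.
```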
Examples
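A basic usage sketch, assuming a pretrained model file "model.tflite" and an input array in that already matches the model's expected input size and type (both are placeholders, not shipped files):

```matlab
% Load the pretrained TensorFlow Lite model from file.
net = loadTFLiteModel("model.tflite");

% Run inference with the TFLiteModel object; the input must be a
% numeric array sized and typed to match the model's input.
out = predict(net, in);
```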
Input Arguments
Name-Value Arguments
Output Arguments
Extended Capabilities
Version History
Introduced in R2022a
See Also
Topics
- Deploy Pose Estimation Application Using TensorFlow Lite (TFLite) Model on Host and Raspberry Pi
- Generate Code for TensorFlow Lite (TFLite) Model and Deploy on Raspberry Pi
- Deploy Super Resolution Application That Uses TensorFlow Lite (TFLite) Model on Host and Raspberry Pi
- Prerequisites for Deep Learning with TensorFlow Lite Models