loadTFLiteModel
Syntax
net = loadTFLiteModel(modelFileName)
___ = loadTFLiteModel(___,PreserveDataFormats=true)
Description
net = loadTFLiteModel(modelFileName) loads a pretrained TensorFlow™ Lite model file modelFileName and returns a TFLiteModel object.
Use this TFLiteModel object with the predict function in your MATLAB® code to perform inference in MATLAB execution, code generation, or a MATLAB Function block in Simulink® models. For more information, see Prerequisites for Deep Learning with TensorFlow Lite Models.
To use this function, you must install the Deep Learning Toolbox Interface for TensorFlow Lite support package.
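For example, a minimal sketch of loading a model and running inference in MATLAB might look like the following. The file name mobilenet.tflite and the input I are placeholders, not files or variables shipped with the support package; substitute your own pretrained TensorFlow Lite model and an input sized to match it.
% Load a pretrained TensorFlow Lite model (file name is a placeholder).
net = loadTFLiteModel("mobilenet.tflite");
% Inspect the input size that the model expects.
disp(net.InputSize)
% Run inference on an input I sized to match the model's input layer.
scores = predict(net,I);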
___ = loadTFLiteModel(___,PreserveDataFormats=true) preserves the layouts of the input and output data to match the layout of the pretrained TensorFlow Lite model during model construction. You must specify this option as a compile-time constant for the code generation workflow.
By default, PreserveDataFormats is set to false. Set PreserveDataFormats explicitly only when you need to maintain and produce the same data layout as the pretrained TensorFlow Lite model.
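For instance, a hedged sketch of this option is shown below; the file name and the input variable in are placeholders. With PreserveDataFormats=true, the data you pass to predict and the data it returns use the layout defined by the pretrained TensorFlow Lite model.
% Preserve the input and output data layouts of the original model
% (file name is a placeholder). For code generation, this name-value
% argument must be a compile-time constant, as in the literal below.
net = loadTFLiteModel("mobilenet.tflite",PreserveDataFormats=true);
% Inputs passed to predict must use the model's original layout.
out = predict(net,in);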
Examples
Input Arguments
Output Arguments
Extended Capabilities
Version History
Introduced in R2022a
See Also
Topics
- Deploy Pose Estimation Application Using TensorFlow Lite (TFLite) Model on Host and Raspberry Pi
- Generate Code for TensorFlow Lite (TFLite) Model and Deploy on Raspberry Pi
- Deploy Super Resolution Application That Uses TensorFlow Lite (TFLite) Model on Host and Raspberry Pi
- Prerequisites for Deep Learning with TensorFlow Lite Models