coder.TensorRTConfig
Parameters to configure deep learning code generation with the NVIDIA TensorRT library
Description
The coder.TensorRTConfig object contains parameters specific to TensorRT, the NVIDIA® high-performance deep learning inference optimizer and runtime library. codegen uses these parameters to generate
CUDA® code for deep neural networks.
To use a coder.TensorRTConfig object for code generation, assign it to
the DeepLearningConfig property of a coder.gpuConfig
object that you pass to codegen.
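For example, a minimal sketch of this workflow, where the entry-point function name myPredict and its input size are placeholders for your own code:

% Sketch: generate a MEX function that uses TensorRT for inference.
cfg = coder.gpuConfig('mex');                                   % GPU code generation configuration
cfg.TargetLang = 'C++';                                         % CUDA code generation requires C++
cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');  % TensorRT-specific parameters
codegen -config cfg myPredict -args {ones(224,224,3,'single')}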
Creation
Create a TensorRT configuration object by using the coder.DeepLearningConfig function with the target library set to
'tensorrt'.
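For instance, a short sketch of creating the object and selecting reduced-precision inference; it assumes the DataType property and its 'fp16' value are supported in your release and on your target GPU:

dlcfg = coder.DeepLearningConfig('tensorrt');  % returns a coder.TensorRTConfig object
dlcfg.DataType = 'fp16';                       % assumes FP16 precision is available on the target GPU
disp(dlcfg)                                    % inspect the TensorRT-specific properties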
Properties
Examples
Version History
Introduced in R2018b
See Also
Functions
codegen | imagePretrainedNetwork (Deep Learning Toolbox) | coder.DeepLearningConfig | coder.loadDeepLearningNetwork