
Operations

Develop custom deep learning functions

For most tasks, you can use built-in layers. If no built-in layer provides what you need for your task, then you can define your own custom layer. You can specify a custom loss function using a custom output layer and define custom layers with learnable and state parameters. After defining a custom layer, you can check that the layer is valid, GPU compatible, and outputs correctly defined gradients. To learn more, see Define Custom Deep Learning Layers. For a list of supported layers, see List of Deep Learning Layers.
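As a minimal sketch, a custom layer is a class that inherits from nnet.layer.Layer and implements a predict method. The scalingLayer class and its Scale parameter below are hypothetical names for illustration; real layers can also define forward, initialize, and state parameters.

classdef scalingLayer < nnet.layer.Layer
    % scalingLayer  Hypothetical custom layer that multiplies its input
    % by a learnable scalar.

    properties (Learnable)
        Scale % Learnable scaling factor
    end

    methods
        function layer = scalingLayer(name)
            % Set the layer name, description, and initial parameter value.
            layer.Name = name;
            layer.Description = "Learnable element-wise scaling";
            layer.Scale = 1;
        end

        function Z = predict(layer, X)
            % Apply the learnable scale to the input at prediction time.
            Z = layer.Scale .* X;
        end
    end
end

After defining the class, you can pass an instance to checkLayer to validate it against a sample input size, for example checkLayer(scalingLayer("scale"), [28 28 1]).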

If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss function that you need, then you can define a custom training loop. For models that cannot be specified as networks of layers, you can define the model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
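The core of a custom training loop is a model loss function evaluated with dlfeval, so that dlgradient can compute gradients by automatic differentiation. The sketch below shows one training iteration, assuming a dlnetwork object net, a formatted dlarray mini-batch X with targets T, and hyperparameters learnRate and momentum defined elsewhere; modelLoss is a hypothetical helper name.

% One iteration of a custom training loop (sketch).
velocity = [];
[loss, gradients] = dlfeval(@modelLoss, net, X, T);
[net, velocity] = sgdmupdate(net, gradients, velocity, learnRate, momentum);

function [loss, gradients] = modelLoss(net, X, T)
    % Forward pass, loss, and gradients, traced inside dlfeval so that
    % dlgradient can apply automatic differentiation.
    Y = forward(net, X);
    loss = crossentropy(Y, T);
    gradients = dlgradient(loss, net.Learnables);
end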

Use deep learning operations to develop MATLAB® code for custom layers, training loops, and model functions.

Functions


dlarray - Deep learning array for customization (Since R2019b)
dims - Dimension labels of dlarray (Since R2019b)
finddim - Find dimensions with specified label (Since R2019b)
stripdims - Remove dlarray data format (Since R2019b)
extractdata - Extract data from dlarray (Since R2019b)
isdlarray - Check if object is dlarray (Since R2020b)
dlconv - Deep learning convolution (Since R2019b)
dltranspconv - Deep learning transposed convolution (Since R2019b)
lstm - Long short-term memory (Since R2019b)
gru - Gated recurrent unit (Since R2020a)
attention - Dot-product attention (Since R2022b)
embed - Embed discrete data (Since R2020b)
fullyconnect - Sum all weighted input data and apply a bias (Since R2019b)
dlode45 - Deep learning solution of nonstiff ordinary differential equation (ODE) (Since R2021b)
batchnorm - Normalize data across all observations for each channel independently (Since R2019b)
crosschannelnorm - Cross channel square-normalize using local responses (Since R2020a)
groupnorm - Normalize data across grouped subsets of channels for each observation independently (Since R2020b)
instancenorm - Normalize across each channel for each observation independently (Since R2021a)
layernorm - Normalize data across all channels for each observation independently (Since R2021a)
avgpool - Pool data to average values over spatial dimensions (Since R2019b)
maxpool - Pool data to maximum value (Since R2019b)
maxunpool - Unpool the output of a maximum pooling operation (Since R2019b)
relu - Apply rectified linear unit activation (Since R2019b)
leakyrelu - Apply leaky rectified linear unit activation (Since R2019b)
gelu - Apply Gaussian error linear unit (GELU) activation (Since R2022b)
softmax - Apply softmax activation to channel dimension (Since R2019b)
sigmoid - Apply sigmoid activation (Since R2019b)
crossentropy - Cross-entropy loss for classification tasks (Since R2019b)
l1loss - L1 loss for regression tasks (Since R2021b)
l2loss - L2 loss for regression tasks (Since R2021b)
huber - Huber loss for regression tasks (Since R2021a)
mse - Half mean squared error (Since R2019b)
ctc - Connectionist temporal classification (CTC) loss for unaligned sequence classification (Since R2021a)
dlaccelerate - Accelerate deep learning function for custom training loops (Since R2021a)
AcceleratedFunction - Accelerated deep learning function (Since R2021a)
clearCache - Clear accelerated deep learning function trace cache (Since R2021a)
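As an illustration of how these operations combine, the sketch below defines a small classification model as a function rather than as a network of layers. The parameters structure and its field names are hypothetical, and the input X is assumed to be a formatted dlarray (for example, with format "SSCB").

function Y = model(parameters, X)
    % Convolution followed by a ReLU nonlinearity.
    Y = dlconv(X, parameters.conv.Weights, parameters.conv.Bias, ...
        Padding="same");
    Y = relu(Y);
    % Downsample over the spatial dimensions.
    Y = maxpool(Y, 2, Stride=2);
    % Fully connected operation and softmax over the channel dimension.
    Y = fullyconnect(Y, parameters.fc.Weights, parameters.fc.Bias);
    Y = softmax(Y);
end

If a model function like this is evaluated repeatedly inside a training loop, you can wrap it with dlaccelerate to cache its trace, for example accfun = dlaccelerate(@model); Y = dlfeval(accfun, parameters, X);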

Topics

Automatic Differentiation

Model Functions

Deep Learning Function Acceleration

Related Information