Custom Training Loops
If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss functions that you need, then you can define a custom training loop. For models that layer graphs do not support, you can define a custom model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
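As a hedged illustration of the structure such a loop takes, here is a minimal hand-written training loop for a linear model in NumPy. This is illustrative only: it is not the Toolbox API, and the gradient is written out by hand rather than computed by automatic differentiation.

```python
import numpy as np

# Illustrative only: a minimal custom training loop for a linear model.
# In a real custom training loop you would compute the loss gradient by
# automatic differentiation and apply a solver update; here the gradient
# of the mean squared error is written out analytically.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 observations, 3 features
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                           # targets from a known linear rule

w = np.zeros(3)                          # learnable parameters
lr = 0.1                                 # learning rate

for epoch in range(200):
    pred = X @ w                         # forward pass
    loss = np.mean((pred - y) ** 2)      # mean squared error
    grad = 2 * X.T @ (pred - y) / len(y) # analytic gradient of the loss
    w -= lr * grad                       # gradient-descent update
```

After training, `w` recovers `true_w` closely; the same skeleton (forward pass, loss, gradient, parameter update) underlies every custom training loop, whatever the model or solver.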
- Deep learning network for custom training loops (Since R2019b)
- Reset state parameters of neural network
- Plot neural network architecture
- Add input layer to network (Since R2022b)
- Add layers to layer graph or network
- Remove layers from layer graph or network
- Connect layers in layer graph or network
- Disconnect layers in layer graph or network
- Replace layer in layer graph or network
- Print network summary (Since R2022b)
- Initialize learnable and state parameters of a dlnetwork (Since R2021a)
- Deep learning network data layout for learnable parameter initialization (Since R2022b)
- Graph of network layers for deep learning
- Set L2 regularization factor of layer learnable parameter
- Get L2 regularization factor of layer learnable parameter
- Set learn rate factor of layer learnable parameter
- Get learn rate factor of layer learnable parameter
- Compute deep learning network output for training (Since R2019b)
- Compute deep learning network output for inference (Since R2019b)
- Update parameters using adaptive moment estimation (Adam) (Since R2019b)
- Update parameters using root mean squared propagation (RMSProp) (Since R2019b)
- Update parameters using stochastic gradient descent with momentum (SGDM) (Since R2019b)
- Update parameters using limited-memory BFGS (L-BFGS) (Since R2023a)
- State of limited-memory BFGS (L-BFGS) solver (Since R2023a)
- Update parameters using custom function (Since R2019b)
- Monitor and plot training progress for deep learning custom training loops (Since R2022b)
- Update information values for custom training loops (Since R2022b)
- Record metric values for custom training loops (Since R2022b)
- Group metrics in training plot (Since R2022b)
- Pad or truncate sequence data to same length (Since R2021a)
- Create mini-batches for deep learning (Since R2020b)
- Encode data labels into one-hot vectors (Since R2020b)
- Decode probability vectors into class labels (Since R2020b)
- Obtain next mini-batch of data from minibatchqueue (Since R2020b)
- Reset minibatchqueue to start of data (Since R2020b)
- Shuffle data in minibatchqueue (Since R2020b)
- Determine if minibatchqueue can return mini-batch (Since R2020b)
- Partition minibatchqueue (Since R2020b)
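The solver entries above update parameters using Adam, RMSProp, and SGDM. As a hedged sketch of what one such rule does, here is the standard Adam update in NumPy, using the usual notation from the Adam paper; this illustrates the algorithm only and is not the Toolbox implementation.

```python
import numpy as np

# Standard Adam update rule (Kingma & Ba notation), sketched in NumPy.
# Illustrative only; not the Toolbox API.

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step. t is the 1-based iteration count."""
    m = beta1 * m + (1 - beta1) * grad         # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2    # second-moment estimate
    m_hat = m / (1 - beta1 ** t)               # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Usage: minimize f(w) = w^2 starting from w = 1.
w, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    grad = 2 * w                               # gradient of w^2
    w, m, v = adam_step(w, grad, m, v, t, lr=0.05)
```

The per-parameter scaling by the second-moment estimate is what distinguishes Adam from plain SGDM; RMSProp keeps only the second-moment term.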
- Deep learning array for customization (Since R2019b)
- Compute gradients for custom training loops using automatic differentiation (Since R2019b)
- Evaluate deep learning model for custom training loops (Since R2019b)
- Dimension labels of dlarray (Since R2019b)
- Find dimensions with specified label (Since R2019b)
- dlarray data format (Since R2019b)
- Extract data from dlarray (Since R2019b)
- Check if object is dlarray
- Cross-entropy loss for classification tasks (Since R2019b)
- L1 loss for regression tasks (Since R2021b)
- L2 loss for regression tasks (Since R2021b)
- Huber loss for regression tasks (Since R2021a)
- Half mean squared error (Since R2019b)
- Connectionist temporal classification (CTC) loss for unaligned sequence classification (Since R2021a)
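Two of the losses listed above, cross-entropy for classification and Huber for regression, can be sketched directly from their textbook definitions. The NumPy versions below are illustrative only, not the Toolbox implementations.

```python
import numpy as np

# Textbook definitions of two losses from the list above, in NumPy.
# Illustrative only; not the Toolbox implementations.

def cross_entropy(probs, onehot_targets):
    """Mean cross-entropy over observations (rows of probabilities)."""
    eps = 1e-12                           # guard against log(0)
    return -np.mean(np.sum(onehot_targets * np.log(probs + eps), axis=1))

def huber(pred, target, delta=1.0):
    """Mean Huber loss: quadratic near zero, linear in the tails."""
    r = np.abs(pred - target)
    quad = 0.5 * r ** 2                   # quadratic branch, |r| <= delta
    lin = delta * (r - 0.5 * delta)       # linear branch, |r| > delta
    return np.mean(np.where(r <= delta, quad, lin))
```

The Huber loss's linear tails make it less sensitive to outliers than L2 loss while remaining smooth at zero, which is why it is often preferred for regression on noisy targets.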
- Train Deep Learning Model in MATLAB
Learn how to train deep learning models in MATLAB®.
- Define Custom Training Loops, Loss Functions, and Networks
Learn how to define and customize deep learning training loops, loss functions, and networks using automatic differentiation.
- Train Network Using Custom Training Loop
This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.
- Train Sequence Classification Network Using Custom Training Loop
This example shows how to train a network that classifies sequences with a custom learning rate schedule.
- Specify Training Options in Custom Training Loop
Learn how to specify common training options in a custom training loop.
- Define Model Loss Function for Custom Training Loop
Learn how to define a model loss function for a custom training loop.
- Update Batch Normalization Statistics in Custom Training Loop
This example shows how to update the network state in a custom training loop.
- Make Predictions Using dlnetwork Object
This example shows how to make predictions using a dlnetwork object by splitting data into mini-batches.
- Monitor Custom Training Loop Progress
Track and plot custom training loop progress.
- Multiple-Input and Multiple-Output Networks
Learn how to define and train deep learning networks with multiple inputs or multiple outputs.
- Train Network with Multiple Outputs
This example shows how to train a deep learning network with multiple outputs that predict both labels and angles of rotations of handwritten digits.
- Classify Videos Using Deep Learning with Custom Training Loop
This example shows how to create a network for video classification by combining a pretrained image classification model and a sequence classification network.
- Train Image Classification Network Robust to Adversarial Examples
This example shows how to train a neural network that is robust to adversarial examples using fast gradient sign method (FGSM) adversarial training.
- Train Robust Deep Learning Network with Jacobian Regularization
Train a neural network that is robust to adversarial examples using a Jacobian regularization scheme.
- Solve Ordinary Differential Equation Using Neural Network
This example shows how to solve an ordinary differential equation (ODE) using a neural network.
- Assemble Multiple-Output Network for Prediction
This example shows how to assemble a multiple output network for prediction.
- Train Network in Parallel with Custom Training Loop
This example shows how to set up a custom training loop to train a network in parallel.
- Run Custom Training Loops on a GPU and in Parallel
Speed up custom training loops by running on a GPU, in parallel using multiple GPUs, or on a cluster.
- Deep Learning Data Formats
Learn about deep learning data formats.
- List of Functions with dlarray Support
View the list of functions that support dlarray objects.
- Automatic Differentiation Background
Learn how automatic differentiation works.
- Use Automatic Differentiation In Deep Learning Toolbox
How to use automatic differentiation in deep learning.
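The automatic-differentiation topics above describe how gradients are computed for custom training loops. As a hedged illustration of the underlying idea, here is a minimal forward-mode automatic differentiator using dual numbers in Python. The Toolbox uses reverse mode (backpropagation over a recorded trace) for efficiency with many parameters, but the chain-rule bookkeeping is the same in spirit.

```python
import math

# Minimal forward-mode automatic differentiation with dual numbers.
# Each value carries its derivative alongside it; arithmetic rules
# apply the chain rule automatically. Illustrative only.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val    # function value
        self.dot = dot    # derivative carried alongside the value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# Differentiate f(x) = x*sin(x) + 3x at x = 2 by seeding dot = 1.
x = Dual(2.0, 1.0)
y = x * sin(x) + 3 * x
# y.val holds f(2); y.dot holds f'(2) = sin(2) + 2*cos(2) + 3
```

Seeding `dot = 1` on the input propagates exact derivatives through every operation, with no finite-difference approximation, which is the key property automatic differentiation shares across forward and reverse modes.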
Deep Learning Function Acceleration
- Deep Learning Function Acceleration for Custom Training Loops
Accelerate model functions and model loss functions for custom training loops by caching and reusing traces.
- Accelerate Custom Training Loop Functions
This example shows how to accelerate deep learning custom training loop and prediction functions.
- Check Accelerated Deep Learning Function Outputs
This example shows how to check that the outputs of accelerated functions match the outputs of the underlying function.
- Evaluate Performance of Accelerated Deep Learning Function
This example shows how to evaluate the performance gains of using an accelerated function.
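The acceleration topics above describe caching and reusing traces so that the expensive setup work happens once per input signature. A hedged Python sketch of that caching idea follows; it mirrors the concept only, keyed on input shape and type, and is not the Toolbox implementation.

```python
import numpy as np

# Hedged sketch of trace caching, the idea behind the acceleration
# workflow above: perform an expensive "tracing" step once per input
# signature (shape and dtype) and reuse it for subsequent matching
# calls. Concept illustration only; not the Toolbox implementation.

_trace_cache = {}
trace_count = 0   # how many times the slow tracing step actually ran

def traced_apply(x):
    global trace_count
    key = (x.shape, x.dtype.str)           # cache key: input signature
    if key not in _trace_cache:
        trace_count += 1
        # "Trace" the computation once: capture a reusable callable.
        _trace_cache[key] = lambda a: np.tanh(a) * 2.0
    return _trace_cache[key](x)            # reuse the cached trace

a = np.ones((4, 3))
b = np.zeros((4, 3))
traced_apply(a)                            # traces on the first call
out = traced_apply(b)                      # same signature: cache hit
```

Because the cache is keyed on the input signature rather than the input values, new data with the same shape and type reuses the cached trace, which is why accelerated functions pay the tracing cost only on the first call.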