Tuning
To learn how to set options using the trainingOptions function, see Set Up Parameters and Train Convolutional Neural Network. After you identify some good starting options, you can automate hyperparameter sweeps or try Bayesian optimization using Experiment Manager.
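For example, a minimal sketch of specifying options and passing them to trainnet might look like the following; the data variables, layer array, and option values are placeholder assumptions, not taken from this page.

options = trainingOptions("adam", ...
    InitialLearnRate=1e-3, ...
    MaxEpochs=20, ...
    MiniBatchSize=128, ...
    Shuffle="every-epoch", ...
    ValidationData={XValidation,TValidation}, ...
    Plots="training-progress");

net = trainnet(XTrain,TTrain,layers,"crossentropy",options);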
Investigate network robustness by generating adversarial examples. You can then use fast gradient sign method (FGSM) adversarial training to train a network robust to adversarial perturbations.
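As an illustration only, a minimal FGSM perturbation sketch might look like the following, assuming net is a trained dlnetwork, X is a formatted dlarray mini-batch, T holds the one-hot targets, and epsilon is a chosen perturbation size; the helper name perturbationGradient is hypothetical.

epsilon = 0.02;
[~,gradX] = dlfeval(@perturbationGradient,net,X,T);
XAdv = X + epsilon*sign(gradX);   % adversarial examples

function [loss,gradX] = perturbationGradient(net,X,T)
    Y = forward(net,X);
    loss = crossentropy(Y,T);
    gradX = dlgradient(loss,X);   % gradient of the loss with respect to the input
end

For adversarial training, this perturbation step is typically applied to each mini-batch inside a custom training loop.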
Apps
Deep Network Designer | Design, visualize, and train deep learning networks |
Objects
trainingProgressMonitor | Monitor and plot training progress for deep learning custom training loops (Since R2022b) |
Functions
trainingOptions | Options for training deep learning neural network |
trainNetwork | Train neural network |
trainnet | Train deep learning neural network (Since R2023b) |
Topics
- Set Up Parameters and Train Convolutional Neural Network
Learn how to set up training parameters for a convolutional neural network.
- Deep Learning Using Bayesian Optimization
This example shows how to apply Bayesian optimization to deep learning and find optimal network hyperparameters and training options for convolutional neural networks.
- Detect Issues During Deep Neural Network Training
This example shows how to automatically detect issues while training a deep neural network.
- Train Deep Learning Networks in Parallel
This example shows how to run multiple deep learning experiments on your local machine.
- Train Network Using Custom Training Loop
This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.
- Compare Activation Layers
This example shows how to compare the accuracy of training networks with ReLU, leaky ReLU, ELU, and swish activation layers.
- Generate Experiment Using Deep Network Designer
Use Experiment Manager to tune the hyperparameters of a network trained in Deep Network Designer.
- Deep Learning Tips and Tricks
Learn how to improve the accuracy of deep learning networks.
- Specify Custom Weight Initialization Function
This example shows how to create a custom He weight initialization function for convolution layers followed by leaky ReLU layers.
- Compare Layer Weight Initializers
This example shows how to train deep learning networks with different weight initializers.
- Customize Output During Deep Learning Network Training
This example shows how to define an output function that runs at each iteration during training of deep learning neural networks.
- Create Custom Deep Learning Training Plot
This example shows how to create a custom training plot that updates at each iteration during training of deep learning neural networks using trainnet. (Since R2023b)
- Custom Stopping Criteria for Deep Learning Training
This example shows how to stop training of deep learning neural networks based on custom stopping criteria using trainnet. (Since R2023b)