Optimization
To learn how to set options using the trainingOptions function, see Set Up Parameters and Train Convolutional Neural Network. After you identify some good starting options, you can automate sweeping of hyperparameters or try Bayesian optimization using Experiment Manager.
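As a starting point, a minimal sketch of typical training options set with trainingOptions (the solver, learning rate, epoch count, and batch size below are assumed example values, not recommendations from this page):

```matlab
% Sketch: common starting options for network training.
% XVal and TVal are assumed, pre-existing validation data.
options = trainingOptions("adam", ...
    InitialLearnRate=1e-3, ...
    MaxEpochs=30, ...
    MiniBatchSize=128, ...
    Shuffle="every-epoch", ...
    ValidationData={XVal,TVal}, ...
    Plots="training-progress");
```

From here, sweeping values such as InitialLearnRate or MiniBatchSize across experiments is the kind of search that Experiment Manager can automate.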
Examine the robustness of the network by generating adversarial examples. You can then use fast gradient sign method (FGSM) adversarial training to train a network that is robust to adversarial perturbations.
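The core FGSM step perturbs the input in the direction of the sign of the loss gradient. A minimal sketch, assuming net is a trained dlnetwork and X and T are formatted dlarray inputs and one-hot targets (the function name lossGradients and the value of epsilon are assumptions for illustration):

```matlab
% Sketch of generating one FGSM adversarial example.
epsilon = 0.1;  % perturbation size (assumed value)

[loss,gradX] = dlfeval(@lossGradients,net,X,T);
XAdv = X + epsilon*sign(gradX);   % FGSM perturbation of the input

function [loss,gradX] = lossGradients(net,X,T)
    Y = forward(net,X);
    loss = crossentropy(Y,T);
    gradX = dlgradient(loss,X);   % gradient of the loss w.r.t. the input
end
```

FGSM adversarial training then mixes such perturbed examples into the training data inside the training loop.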
Apps
Deep Network Designer | Design and visualize deep learning networks
Objects
trainingProgressMonitor | Monitor and plot training progress for deep learning custom training loops (Since R2022b)
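A minimal sketch of how trainingProgressMonitor is used inside a custom training loop (the loop body is abbreviated, and numIterations and loss are assumed to be defined by the surrounding loop):

```matlab
% Sketch: track and plot loss during a custom training loop.
monitor = trainingProgressMonitor(Metrics="Loss", ...
    Info="Epoch",XLabel="Iteration");

iteration = 0;
while iteration < numIterations && ~monitor.Stop
    iteration = iteration + 1;
    % ... evaluate the loss and update the network here ...
    recordMetrics(monitor,iteration,Loss=loss);
    monitor.Progress = 100*iteration/numIterations;
end
```

Checking monitor.Stop in the loop condition lets the Stop button in the progress window end training early.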
Functions
trainingOptions | Options for training deep learning neural network
trainnet | Train deep learning neural network (Since R2023b)
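A minimal sketch of a trainnet call, assuming XTrain and TTrain are pre-existing training data and net is a layer array or dlnetwork (the variable names are assumptions for illustration):

```matlab
% Sketch: train a classification network with trainnet.
options = trainingOptions("adam",Plots="training-progress");
netTrained = trainnet(XTrain,TTrain,net,"crossentropy",options);
```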
Topics
- Set Up Parameters and Train Convolutional Neural Network
Learn how to set up training parameters for a convolutional neural network.
- Deep Learning Using Bayesian Optimization
This example shows how to apply Bayesian optimization to deep learning and find optimal network hyperparameters and training options for convolutional neural networks.
- Detect Issues During Deep Neural Network Training
This example shows how to automatically detect issues while training a deep neural network.
- Train Deep Learning Networks in Parallel
This example shows how to run multiple deep learning experiments on your local machine.
- Train Network Using Custom Training Loop
This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.
- Compare Activation Layers
This example shows how to compare the accuracy of training networks with ReLU, leaky ReLU, ELU, and swish activation layers.
- Deep Learning Tips and Tricks
Learn how to improve the accuracy of deep learning networks.
- Speed Up Deep Neural Network Training
Learn how to accelerate deep neural network training.
- Specify Custom Weight Initialization Function
This example shows how to create a custom He weight initialization function for convolution layers followed by leaky ReLU layers.
- Compare Layer Weight Initializers
This example shows how to train deep learning networks with different weight initializers.
- Create Custom Deep Learning Training Plot
This example shows how to create a custom training plot that updates at each iteration during training of deep learning neural networks using trainnet. (Since R2023b)
- Custom Stopping Criteria for Deep Learning Training
This example shows how to stop training of deep learning neural networks based on custom stopping criteria using trainnet. (Since R2023b)