Deep Learning Tuning
To learn how to set up options using the trainingOptions function, see Set Up Parameters and Train Convolutional Neural Network. After identifying some good starting options, you can automate sweeping of hyperparameters or try Bayesian optimization using Experiment Manager.
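For example, here is a minimal sketch of configuring training options with trainingOptions. The solver and the specific values are illustrative starting points only, and imdsValidation is a hypothetical validation datastore:

```matlab
% Illustrative starting options; tune these per task (values are assumptions).
options = trainingOptions("sgdm", ...
    InitialLearnRate=0.01, ...         % initial learning rate for SGDM
    MaxEpochs=10, ...                  % number of passes over the training data
    MiniBatchSize=128, ...             % observations per iteration
    ValidationData=imdsValidation, ... % hypothetical validation datastore
    Plots="training-progress");        % show the training progress plot
```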
Examine the robustness of the network by generating adversarial examples. You can then use fast gradient sign method (FGSM) adversarial training to train a network that is robust to adversarial perturbations.
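As a rough illustration, FGSM perturbs an input in the direction of the sign of the loss gradient with respect to that input. A minimal sketch, assuming a trained dlnetwork net, a formatted dlarray input X, one-hot targets T, and a perturbation bound epsilon (all hypothetical names):

```matlab
function XAdv = fgsmAdversarialExample(net, X, T, epsilon)
    % Compute the gradient of the loss with respect to the input.
    gradX = dlfeval(@inputGradient, net, X, T);
    % Step the input in the direction that increases the loss.
    XAdv = X + epsilon * sign(gradX);
end

function gradX = inputGradient(net, X, T)
    Y = forward(net, X);
    loss = crossentropy(Y, T);
    gradX = dlgradient(loss, X);
end
```

FGSM adversarial training then includes such perturbed inputs during training so that the network learns to classify them correctly.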
Apps
Deep Network Designer | Design, visualize, and train deep learning networks |
Functions
trainingOptions | Options for training deep learning neural network |
trainNetwork | Train deep learning neural network |
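As a brief sketch of how these two functions fit together, the following trains a small illustrative CNN; the layer array is an assumption (a toy network for 28-by-28 grayscale images), and imdsTrain and options are the hypothetical names used above:

```matlab
% Illustrative layer array, not a recommended architecture (an assumption).
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, Padding="same")
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Train with the options configured earlier.
net = trainNetwork(imdsTrain, layers, options);
```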
Topics
- Set Up Parameters and Train Convolutional Neural Network
Learn how to set up training parameters for a convolutional neural network.
- Deep Learning Using Bayesian Optimization
This example shows how to apply Bayesian optimization to deep learning and find optimal network hyperparameters and training options for convolutional neural networks.
- Train Deep Learning Networks in Parallel
This example shows how to run multiple deep learning experiments on your local machine.
- Train Network Using Custom Training Loop
This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.
- Compare Activation Layers
This example shows how to compare the accuracy of training networks with ReLU, leaky ReLU, ELU, and swish activation layers.
- Generate Experiment Using Deep Network Designer
Use Experiment Manager to tune the hyperparameters of a network trained in Deep Network Designer.
- Deep Learning Tips and Tricks
Learn how to improve the accuracy of deep learning networks.
- Train Robust Deep Learning Network with Jacobian Regularization
This example shows how to train a neural network that is robust to adversarial examples using a Jacobian regularization scheme [1].
- Specify Custom Weight Initialization Function
This example shows how to create a custom He weight initialization function for convolution layers followed by leaky ReLU layers.
- Compare Layer Weight Initializers
This example shows how to train deep learning networks with different weight initializers.