Build Deep Neural Networks

Build neural networks for image data using MATLAB® code or interactively using Deep Network Designer

Create new deep networks for tasks such as image classification and regression by defining the network architecture from scratch. Build networks using MATLAB or interactively using Deep Network Designer.
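
For example, a small image classification network can be assembled as a layer array and converted to a dlnetwork object. This is a minimal sketch; the input size, filter counts, and number of classes are illustrative assumptions.

    % Define a small convolutional network for 28-by-28 grayscale images (assumed sizes).
    layers = [
        imageInputLayer([28 28 1])
        convolution2dLayer(3,16,Padding="same")
        batchNormalizationLayer
        reluLayer
        maxPooling2dLayer(2,Stride=2)
        fullyConnectedLayer(10)
        softmaxLayer];

    % Create the network object and inspect its architecture.
    net = dlnetwork(layers);
    analyzeNetwork(net)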

For most tasks, you can use built-in layers. If there is not a built-in layer that you need for your task, then you can define your own custom layer. You can specify a custom loss function using a custom output layer and define custom layers with learnable and state parameters. After defining a custom layer, you can check that the layer is valid, GPU compatible, and outputs correctly defined gradients. For a list of supported layers, see List of Deep Learning Layers.
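
For example, a layer wrapped in a functionLayer object can be validated with checkLayer. This is a minimal sketch; the scaling function, layer name, and valid input size are illustrative assumptions.

    % Wrap a simple element-wise operation in a function layer (assumed example).
    layer = functionLayer(@(X) 2*X,Name="scale");

    % Check validity for 28-by-28-by-3 inputs; observations lie along dimension 4.
    validInputSize = [28 28 3];
    checkLayer(layer,validInputSize,ObservationDimension=4)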

For models that cannot be specified as networks of layers, you can define the model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
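
As a minimal sketch (the function name and parameter structure fields are assumptions), such a model is a function that takes the network parameters and a dlarray input and applies deep learning operations directly:

    function Y = model(parameters,X)
        % Fully connected operation followed by a ReLU nonlinearity.
        % parameters.fc.Weights and parameters.fc.Bias are assumed dlarray fields.
        Y = fullyconnect(X,parameters.fc.Weights,parameters.fc.Bias);
        Y = relu(Y);
    end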

App

Deep Network Designer - Design, visualize, and train deep learning networks

Functions


Input Layers

imageInputLayer - Image input layer
image3dInputLayer - 3-D image input layer

Convolution and Fully Connected Layers

convolution2dLayer - 2-D convolutional layer
convolution3dLayer - 3-D convolutional layer
groupedConvolution2dLayer - 2-D grouped convolutional layer
transposedConv2dLayer - Transposed 2-D convolution layer
transposedConv3dLayer - Transposed 3-D convolution layer
fullyConnectedLayer - Fully connected layer

Transformer Layers

selfAttentionLayer - Self-attention layer (Since R2023a)
attentionLayer - Dot-product attention layer (Since R2024a)
positionEmbeddingLayer - Position embedding layer (Since R2023b)
sinusoidalPositionEncodingLayer - Sinusoidal position encoding layer (Since R2023b)
embeddingConcatenationLayer - Embedding concatenation layer (Since R2023b)
indexing1dLayer - 1-D indexing layer (Since R2023b)

Neural ODE Layers

neuralODELayer - Neural ODE layer (Since R2023b)

Activation Layers

reluLayer - Rectified Linear Unit (ReLU) layer
leakyReluLayer - Leaky Rectified Linear Unit (ReLU) layer
clippedReluLayer - Clipped Rectified Linear Unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer (Since R2021a)
geluLayer - Gaussian error linear unit (GELU) layer (Since R2022b)
sigmoidLayer - Sigmoid layer (Since R2020b)
softmaxLayer - Softmax layer
functionLayer - Function layer (Since R2021b)

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer (Since R2020b)
instanceNormalizationLayer - Instance normalization layer (Since R2021a)
layerNormalizationLayer - Layer normalization layer (Since R2021a)
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Utility Layers

dropoutLayer - Dropout layer
crop2dLayer - 2-D crop layer
crop3dLayer - 3-D crop layer (Since R2019b)

Pooling and Unpooling Layers

averagePooling2dLayer - Average pooling layer
averagePooling3dLayer - 3-D average pooling layer
globalAveragePooling2dLayer - 2-D global average pooling layer (Since R2019b)
globalAveragePooling3dLayer - 3-D global average pooling layer (Since R2019b)
globalMaxPooling2dLayer - Global max pooling layer (Since R2020a)
globalMaxPooling3dLayer - 3-D global max pooling layer (Since R2020a)
maxPooling2dLayer - Max pooling layer
maxPooling3dLayer - 3-D max pooling layer
maxUnpooling2dLayer - Max unpooling layer

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer (Since R2020b)
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer

Network Creation, Modification, and Analysis

dlnetwork - Deep learning neural network (Since R2019b)
imagePretrainedNetwork - Pretrained neural network for images (Since R2024a)
resnetNetwork - 2-D residual neural network (Since R2024a)
resnet3dNetwork - 3-D residual neural network (Since R2024a)
addLayers - Add layers to neural network
removeLayers - Remove layers from neural network
replaceLayer - Replace layer in neural network
connectLayers - Connect layers in neural network
disconnectLayers - Disconnect layers in neural network
addInputLayer - Add input layer to network (Since R2022b)
initialize - Initialize learnable and state parameters of a dlnetwork (Since R2021a)
networkDataLayout - Deep learning network data layout for learnable parameter initialization (Since R2022b)
setL2Factor - Set L2 regularization factor of layer learnable parameter
getL2Factor - Get L2 regularization factor of layer learnable parameter
setLearnRateFactor - Set learn rate factor of layer learnable parameter
getLearnRateFactor - Get learn rate factor of layer learnable parameter
plot - Plot neural network architecture
summary - Print network summary (Since R2022b)
analyzeNetwork - Analyze deep learning network architecture
checkLayer - Check validity of custom or function layer
isequal - Check equality of neural networks (Since R2021a)
isequaln - Check equality of neural networks ignoring NaN values (Since R2021a)
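
The network construction functions listed above are typically combined to assemble a network programmatically, for example by adding a skip connection to a layer array. This is a minimal sketch; the layer names and sizes are illustrative assumptions.

    % Main branch ending in an addition layer with an unconnected second input.
    layers = [
        imageInputLayer([32 32 3],Name="in")
        convolution2dLayer(3,16,Padding="same",Name="conv1")
        reluLayer(Name="relu1")
        additionLayer(2,Name="add")];
    net = dlnetwork(layers,Initialize=false);

    % Add a 1-by-1 convolution as the skip connection and wire it up.
    net = addLayers(net,convolution2dLayer(1,16,Name="convSkip"));
    net = connectLayers(net,"in","convSkip");
    net = connectLayers(net,"convSkip","add/in2");

    % Initialize the learnable parameters and summarize the result.
    net = initialize(net);
    summary(net)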

Topics

Built-In Layers

Custom Layers