
Build Deep Neural Networks

Build networks for sequence and tabular data using MATLAB® code or interactively using Deep Network Designer

Create new deep networks for tasks such as classification, regression, and forecasting by defining the network architecture from scratch. Build networks using MATLAB or interactively using Deep Network Designer.
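For example, a sequence classification network can be defined as a layer array in a few lines of MATLAB code. This is a minimal sketch; the sizes (12 features, 100 hidden units, 9 classes) are placeholder values:

```matlab
% Sketch: sequence-to-label classification network as a layer array.
% All sizes here are assumed placeholder values.
numFeatures = 12;
numHiddenUnits = 100;
numClasses = 9;

layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits,OutputMode="last")
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
```

You can pass such a layer array to a training function or open it in Deep Network Designer.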

For most tasks, you can use built-in layers. If the layer you need is not built in, you can define your own custom layer. You can specify a custom loss function using a custom output layer and define custom layers with learnable and state parameters. After defining a custom layer, you can check that the layer is valid, is GPU compatible, and outputs correctly defined gradients. For a list of supported layers, see List of Deep Learning Layers.
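As a sketch of that validation step, a custom layer can be checked with checkLayer; here myCustomLayer is a hypothetical user-defined layer class and the valid input size is an assumption:

```matlab
% Sketch: validate a hypothetical custom layer named myCustomLayer.
layer = myCustomLayer;          % assumed user-defined layer class
validInputSize = [24 1];        % assumed size of a valid input observation
checkLayer(layer,validInputSize,ObservationDimension=2)
```

checkLayer reports which validity, GPU-compatibility, and gradient checks pass or fail.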

For models that layer graphs do not support, you can define a custom model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
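As a minimal sketch, such a model function operates directly on dlarray data using deep learning operations such as fullyconnect and relu; the parameters struct and its field names here are assumptions:

```matlab
% Sketch: a model defined as a function for use in a custom training loop.
% parameters is assumed to be a struct of dlarray learnable parameters.
function Y = model(parameters,X)
    % One fully connected operation followed by a ReLU nonlinearity.
    Y = fullyconnect(X,parameters.fc1.Weights,parameters.fc1.Bias);
    Y = relu(Y);
end
```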


Deep Network Designer - Design, visualize, and train deep learning networks



Input Layers

sequenceInputLayer - Sequence input layer
featureInputLayer - Feature input layer (Since R2020b)

Recurrent Layers

lstmLayer - Long short-term memory (LSTM) layer for recurrent neural network (RNN)
bilstmLayer - Bidirectional long short-term memory (BiLSTM) layer for recurrent neural network (RNN)
gruLayer - Gated recurrent unit (GRU) layer for recurrent neural network (RNN) (Since R2020a)
lstmProjectedLayer - Long short-term memory (LSTM) projected layer for recurrent neural network (RNN) (Since R2022b)
gruProjectedLayer - Gated recurrent unit (GRU) projected layer for recurrent neural network (RNN) (Since R2023b)
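The projected variants reduce the number of learnable parameters by factorizing the layer's weight matrices through smaller projector matrices. A minimal sketch, with assumed sizes:

```matlab
% Sketch: a projected LSTM layer with 100 hidden units and output and
% input projector sizes of 8 (assumed values).
layer = lstmProjectedLayer(100,8,8);
```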

Transformer Layers

selfAttentionLayer - Self-attention layer (Since R2023a)
positionEmbeddingLayer - Position embedding layer (Since R2023b)
sinusoidalPositionEncodingLayer - Sinusoidal position encoding layer (Since R2023b)
embeddingConcatenationLayer - Embedding concatenation layer (Since R2023b)
indexing1dLayer - 1-D indexing layer (Since R2023b)
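For example, a self-attention layer is constructed from a number of attention heads and a number of key channels; the values in this sketch are assumptions:

```matlab
% Sketch: self-attention layer with 4 heads and 48 key-and-query
% channels (assumed values).
layer = selfAttentionLayer(4,48);
```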

Neural ODE Layers

neuralODELayer - Neural ODE layer (Since R2023b)

Convolution, Attention, and Fully Connected Layers

convolution1dLayer - 1-D convolutional layer (Since R2021b)
transposedConv1dLayer - Transposed 1-D convolution layer (Since R2022a)
selfAttentionLayer - Self-attention layer (Since R2023a)
fullyConnectedLayer - Fully connected layer

Activation and Dropout Layers

reluLayer - Rectified linear unit (ReLU) layer
leakyReluLayer - Leaky rectified linear unit (ReLU) layer
clippedReluLayer - Clipped rectified linear unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer (Since R2019a)
tanhLayer - Hyperbolic tangent (tanh) layer (Since R2019a)
swishLayer - Swish layer (Since R2021a)
geluLayer - Gaussian error linear unit (GELU) layer (Since R2022b)
sigmoidLayer - Sigmoid layer (Since R2020b)
softmaxLayer - Softmax layer
dropoutLayer - Dropout layer
functionLayer - Function layer (Since R2021b)

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer (Since R2020b)
instanceNormalizationLayer - Instance normalization layer (Since R2021a)
layerNormalizationLayer - Layer normalization layer (Since R2021a)
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Pooling Layers

maxPooling1dLayer - 1-D max pooling layer (Since R2021b)
averagePooling1dLayer - 1-D average pooling layer (Since R2021b)
globalMaxPooling1dLayer - 1-D global max pooling layer (Since R2021b)
globalAveragePooling1dLayer - 1-D global average pooling layer (Since R2021b)

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer (Since R2020b)
concatenationLayer - Concatenation layer (Since R2019a)
depthConcatenationLayer - Depth concatenation layer

Data Manipulation

sequenceFoldingLayer - Sequence folding layer (Since R2019a)
sequenceUnfoldingLayer - Sequence unfolding layer (Since R2019a)
flattenLayer - Flatten layer (Since R2019a)

Output Layers

classificationLayer - Classification output layer
regressionLayer - Regression output layer
Network Construction and Analysis

layerGraph - Graph of network layers for deep learning
plot - Plot neural network architecture
addLayers - Add layers to layer graph or network
removeLayers - Remove layers from layer graph or network
replaceLayer - Replace layer in layer graph or network
connectLayers - Connect layers in layer graph or network
disconnectLayers - Disconnect layers in layer graph or network
DAGNetwork - Directed acyclic graph (DAG) network for deep learning
isequal - Check equality of deep learning layer graphs or networks (Since R2021a)
isequaln - Check equality of deep learning layer graphs or networks, ignoring NaN values (Since R2021a)
analyzeNetwork - Analyze deep learning network architecture
dlnetwork - Deep learning network for custom training loops (Since R2019b)
addInputLayer - Add input layer to network (Since R2022b)
summary - Print network summary (Since R2022b)
initialize - Initialize learnable and state parameters of a dlnetwork (Since R2021a)
networkDataLayout - Deep learning network data layout for learnable parameter initialization (Since R2022b)
checkLayer - Check validity of custom or function layer
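These functions compose naturally: build a layer graph, edit it, then inspect the result. A minimal sketch, in which the layer sizes and names are assumptions:

```matlab
% Sketch: assemble a network as a layer graph, edit it, then analyze it.
lgraph = layerGraph([
    sequenceInputLayer(12,Name="in")
    lstmLayer(100,Name="lstm")
    fullyConnectedLayer(9,Name="fc")
    softmaxLayer(Name="sm")]);

% Swap the LSTM layer for a GRU layer of the same size.
lgraph = replaceLayer(lgraph,"lstm",gruLayer(100,Name="gru"));

% Open the Network Analyzer to check for errors and warnings.
analyzeNetwork(lgraph)
```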

