"Relu" activation function and "Adam" optimizer for Time delay neural network

I want to design a time delay neural network, but I can't find the leaky rectified linear unit (ReLU) activation function or the "Adam" optimization algorithm for this type of network.
Abdelwahab Afifi on 14 Jun 2020
I have already installed the Deep Learning Toolbox, but I don't know how to integrate the ReLU activation function and the "Adam" optimizer within the time delay neural network structure. The time delay neural network is defined as follows:
net = timedelaynet(inputDelays,hiddenSizes,trainFcn)
where trainFcn is limited to the following types:
Bayesian Regularization (trainbr)
BFGS Quasi-Newton (trainbfg)
Resilient Backpropagation (trainrp)
Scaled Conjugate Gradient (trainscg)
Conjugate Gradient with Powell/Beale Restarts (traincgb)
Fletcher-Powell Conjugate Gradient (traincgf)
Polak-Ribiére Conjugate Gradient (traincgp)
One Step Secant (trainoss)
Variable Learning Rate Gradient Descent (traingdx)
Gradient Descent with Momentum (traingdm)
Gradient Descent (traingd)
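For reference, the shallow-network API above does not expose leaky ReLU or Adam; those are only available in the deep-network API of the same toolbox (trainNetwork, trainingOptions, leakyReluLayer). One possible workaround is to rebuild the time delay structure with deep-network layers, where a 1-D convolution over time plays the role of the tapped delay line. The sketch below is an untested outline, not the author's method; the layer sizes and delay length are placeholder assumptions, and convolution1dLayer requires a relatively recent toolbox release.

```matlab
% Sketch (assumptions): emulate a time delay network with deep-network
% layers so that leaky ReLU and the 'adam' solver can be used.
numFeatures = 1;    % input size per time step (placeholder)
hiddenSize  = 10;   % analogous to hiddenSizes in timedelaynet (placeholder)

layers = [
    sequenceInputLayer(numFeatures)
    % filter length 3 acts like a tapped delay line over delays 0:2
    convolution1dLayer(3, hiddenSize)
    leakyReluLayer(0.01)            % leaky ReLU with scale 0.01
    fullyConnectedLayer(1)
    regressionLayer];

options = trainingOptions('adam', ...   % Adam optimizer
    'MaxEpochs', 100, ...
    'Verbose', false);

% XTrain/YTrain must be formatted as sequences; see the trainNetwork docs.
% net = trainNetwork(XTrain, YTrain, layers, options);
```

An alternative without convolution layers is to stack the lagged inputs into the feature dimension yourself and train a plain feedforward stack of fullyConnectedLayer/leakyReluLayer blocks with the same 'adam' options.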


Answers (0)

