Could anyone help me with how to include sine, cosine, and tanh activation functions when training a neural network?
In my code I have written the following:
layers = [ ...
sequenceInputLayer(inputSize)
fullyConnectedLayer(numHiddenUnits1)
reluLayer
fullyConnectedLayer(numHiddenUnits2)
reluLayer
fullyConnectedLayer(numClasses)
reluLayer
regressionLayer]
Now I want to run the code using sine, cosine, and tanh instead of ReLU.
Could anyone please help me with this?
Answers (1)
Akshat
on 27 Aug 2024
Hi Jaah,
I see you want to use different activation functions instead of ReLU.
In the case of "tanh", you can use the built-in "tanhLayer" (see its documentation page in the Deep Learning Toolbox).
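As a minimal sketch, assuming inputSize, numHiddenUnits1, numHiddenUnits2, and numClasses are already defined as in your code, you can simply swap each reluLayer for a tanhLayer:
layers = [ ...
sequenceInputLayer(inputSize)
fullyConnectedLayer(numHiddenUnits1)
tanhLayer
fullyConnectedLayer(numHiddenUnits2)
tanhLayer
fullyConnectedLayer(numClasses)
tanhLayer % mirrors your original architecture; bounds the output to [-1, 1]
regressionLayer]
Note that keeping a tanhLayer right before regressionLayer constrains the network output to [-1, 1], so you may want to remove that last activation if your regression targets fall outside that range.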
Using "sine" and "cosine" as activation functions is not a viable choice, as "sine" and "cosine" are periodic functions and they have many local extrema. Thus, we lose the uniqueness of values. Due to this reason, it is not a popular choice to use these functions as the activation functions.
Hope this helps!
Akshat