Hyperbolic Tangent Activation Function Integrated Circuit Implementation
• Flow chart
• Code
• Result
• Future work
• Reference
What is a Neural Network Activation Function?
The tanh function is another nonlinear activation function that can be used between the layers of a
neural network. It shares a lot in common with the sigmoid activation function, and the two curves look very similar.
But while the sigmoid maps input values into the range (0, 1), tanh maps them into (-1, 1).
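The relationship between the two curves can be made concrete: tanh is just a rescaled and shifted sigmoid, via the identity tanh(x) = 2·sigmoid(2x) − 1. A minimal sketch (function names are illustrative):

```python
import math

def sigmoid(x):
    # Logistic sigmoid: maps any real x into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh_via_sigmoid(x):
    # tanh is a rescaled, shifted sigmoid: tanh(x) = 2*sigmoid(2x) - 1,
    # which maps any real x into the range (-1, 1).
    return 2.0 * sigmoid(2.0 * x) - 1.0

for x in (-3.0, -1.0, 0.0, 1.0, 3.0):
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):+.4f}  tanh={tanh_via_sigmoid(x):+.4f}")
```

Note that sigmoid(0) = 0.5 while tanh(0) = 0, which is one reason tanh is often preferred between hidden layers: its output is zero-centered.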
TanH approximation flowcharts

[Flowchart, up to the 4th term: START → input x → form the numerator P = A + B and the denominator Q = C + E from the low-order series terms → output Out = P/Q.]

[Flowchart, up to the 6th term: START → input x → scale the intermediate series terms by constants (e.g. I = 90·B, E = 270·F, H = 15·G) → form P = A + I + C and Q = D + E + H → output Out = P/Q.]
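The flowcharts above compute tanh as a ratio of two polynomials, Out = P/Q, which maps well to hardware (multipliers, adders, and one divider). The slide's exact coefficients are not legible, so the sketch below uses the standard [3/2] and [5/4] Padé approximants of tanh as an assumption; the structure (build P, build Q, divide) matches the flowcharts:

```python
import math

def tanh_pade_3_2(x):
    # "Up to the 4th term": low-order rational approximation of tanh.
    # [3/2] Pade approximant: tanh(x) ~ (15x + x^3) / (15 + 6x^2),
    # accurate for small |x|.
    p = x * (15.0 + x * x)           # numerator  P = 15x + x^3
    q = 15.0 + 6.0 * x * x           # denominator Q = 15 + 6x^2
    return p / q

def tanh_pade_5_4(x):
    # "Up to the 6th term": higher-order [5/4] Pade approximant:
    # tanh(x) ~ (945x + 105x^3 + x^5) / (945 + 420x^2 + 15x^4).
    x2 = x * x
    p = x * (945.0 + 105.0 * x2 + x2 * x2)    # P = 945x + 105x^3 + x^5
    q = 945.0 + 420.0 * x2 + 15.0 * x2 * x2   # Q = 945 + 420x^2 + 15x^4
    return p / q

for x in (0.25, 0.5, 1.0):
    print(f"x={x}: 4-term={tanh_pade_3_2(x):.6f}  "
          f"6-term={tanh_pade_5_4(x):.6f}  exact={math.tanh(x):.6f}")
```

As expected, the 6th-term version tracks the exact tanh noticeably longer before diverging, at the cost of two extra multiplications per polynomial.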
RESULT
Reference:
• C. E. Nwankpa, W. Ijomah, A. Gachagan, and S. Marshall, "Activation Functions: Comparison of Trends in Practice and Research for Deep Learning."
• https://towardsdatascience.com/complete-guide-of-activation-functions-34076e95d044