A complete guide to Activation Functions used in Neural Networks
Artificial Intelligence (AI) is one of the hottest fields of 2018, and it is changing our world forever. If you know about AI, then you must have heard about Neural Networks, one of the most popular and widely used families of algorithms in AI.
In this article, I will talk about the Activation Functions used in Neural Networks and why they matter so much.
So, how do Neural Networks work?
Artificial Neural Networks (ANNs) are loosely based on the neural networks in our brains. In an ANN, multiple nodes are interconnected so that signals can pass between them, and it is this web of interconnected nodes that gives ANNs their amazing results.
To understand how NNs work, first assume a 2-layer NN. That means an NN with one input layer, one hidden layer, and one output layer.
Note — We don’t count the input layer.
First, we have some input as a vector, and we feed that vector to the network. The network then performs matrix operations on that input vector: it calculates the “weighted sum” of its inputs, adds a bias, and finally applies some Activation Function before passing the value to the next layer. We keep repeating this process until we reach the last layer. This process is known as Forward Propagation.
In the simplest form, to calculate the “weighted sum”, we use the following equation:

z = (w1 · x1) + (w2 · x2) + … + (wn · xn) + b

where the wi are the weights, the xi are the inputs, and b is the bias.
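To make this concrete, here is a minimal NumPy sketch of one forward pass through the 2-layer network described above. The layer sizes, the random weights, and the choice of sigmoid as the activation are illustrative assumptions, not values from this article:

```python
import numpy as np

def sigmoid(z):
    # A common Activation Function; squashes z into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 3 inputs, 4 hidden nodes, 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), np.zeros(4)  # hidden layer
W2, b2 = rng.standard_normal((1, 4)), np.zeros(1)  # output layer

x = np.array([0.5, -1.2, 3.0])  # input vector

z1 = W1 @ x + b1   # weighted sum of inputs + bias (hidden layer)
a1 = sigmoid(z1)   # apply the Activation Function
z2 = W2 @ a1 + b2  # weighted sum + bias (output layer)
out = sigmoid(z2)  # the final prediction
print(out)
```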
The final output value is the prediction, and we use this prediction to calculate the error by comparing the output with the label. We then use the error value to calculate the partial derivatives w.r.t. the weights, and update the weights with those values. We keep repeating this process until the error becomes very small. This process is known as Backward Propagation.
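As a rough sketch of that update rule, here is a tiny gradient-descent loop for a single linear neuron with a squared-error loss. The input, label, learning rate, and step count are made-up values for illustration:

```python
import numpy as np

x = np.array([1.0, 2.0])  # input vector
y = 1.5                   # label
w = np.zeros(2)           # weights
b = 0.0                   # bias
lr = 0.1                  # learning rate (assumed value)

for step in range(100):
    pred = w @ x + b   # Forward Propagation
    error = pred - y   # compare the output with the label
    # Partial derivatives of the squared error w.r.t. the weights and bias
    grad_w = 2 * error * x
    grad_b = 2 * error
    # Update the weights with those values
    w -= lr * grad_w
    b -= lr * grad_b

print(error)  # the error is now very small
```

After enough repetitions the error shrinks toward zero, which is exactly the stopping condition described above.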
This is how a Neural Network works. Now that we understand Neural Networks, we can jump into Activation Functions.
Note — I’m not going to go deep into Forward Propagation and Backward Propagation. If you don’t have any idea about these topics, then please learn them first and then follow this post.