Derivative of the Sigmoid Activation Function

The sigmoid activation function is widely used in Deep Learning; it squashes its input to a value between zero and one. Its derivative has a particularly convenient form: it can be expressed in terms of the sigmoid itself, sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)), which makes it cheap to compute during backpropagation and is one reason the function is so widely used in neural networks.
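As a quick illustration, here is a minimal NumPy sketch (my own, not taken from the video) that evaluates the sigmoid and its derivative using this form:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real-valued input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # The derivative expressed through the sigmoid itself:
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))             # values strictly between 0 and 1
print(sigmoid_derivative(x))  # largest at x = 0, where it equals 0.25
```

Note how the derivative reuses the sigmoid value that the forward pass already computed, so backpropagation needs no extra exponentials.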

In this video, I’ll walk through each step of the derivation and discuss why this form of the derivative is generally preferred over the other ways of writing it.
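For reference, here is a compact sketch of the derivation that the video expands on step by step:

```latex
\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
\frac{d\sigma}{dx}
  = \frac{e^{-x}}{\left(1 + e^{-x}\right)^{2}}
  = \frac{1}{1 + e^{-x}} \cdot \frac{e^{-x}}{1 + e^{-x}}
  = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)
```

The last step uses e^{-x} / (1 + e^{-x}) = 1 - 1 / (1 + e^{-x}) = 1 - sigma(x), which is what lets the derivative be written entirely in terms of the sigmoid's own output.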

To view the video, click here.
