Activation functions are used in neural networks to map inputs to outputs, and in practice we almost always use a non-linear activation function rather than a linear one, because a non-linear function can model far more complex mappings than a linear function can.
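As a minimal sketch of why non-linearity matters (this NumPy example is mine, not from the original post): stacking purely linear layers collapses into a single linear map, while inserting a non-linearity such as ReLU between them does not.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))    # 5 samples, 3 features
W1 = rng.normal(size=(3, 4))   # first layer weights
W2 = rng.normal(size=(4, 2))   # second layer weights

# Two stacked linear layers are equivalent to one linear layer with weights W1 @ W2
print(np.allclose((x @ W1) @ W2, x @ (W1 @ W2)))               # True

# A non-linearity (here ReLU) between the layers breaks that equivalence
print(np.allclose(np.maximum(x @ W1, 0) @ W2, x @ (W1 @ W2)))  # False
```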


SIGMOID:

The sigmoid activation function, also called the squashing function, maps any input value into the range (0, 1), so its outputs can be interpreted as probabilities; it is often used in the final layer to make the results easier to interpret. Sigmoid functions are also an important part of a…
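For concreteness, the sigmoid is σ(x) = 1 / (1 + e^(−x)). A quick NumPy sketch (my illustration, not code from the post) shows how it squashes arbitrary inputs into (0, 1):

```python
import numpy as np

def sigmoid(x):
    # Maps any real input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

logits = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(sigmoid(logits))  # ~[0.018 0.269 0.5 0.731 0.982]
```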


Most of the time our intuition about a real-life problem agrees with the mathematical approach to it, but sometimes, when the mathematics says otherwise, the result sounds so absurd that the human mind refuses to believe it. This is what happened when Marilyn vos Savant, an American columnist, gave an answer to a question that outraged many people with PhDs, including Paul Erdős, who is referred to as the Father of Discrete Mathematics.

Question:

“Suppose you’re on a game show, and you’re given the choice of three doors. Behind one door is a car; behind the others, goats. You pick a…
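The question is truncated above, but it is the well-known Monty Hall problem, and vos Savant's answer was that you should switch doors. A quick simulation sketch (my code, with a hypothetical monty_hall helper) reproduces the counter-intuitive result that switching wins about 2/3 of the time:

```python
import random

def monty_hall(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)   # door hiding the car
        pick = random.randrange(3)  # contestant's first choice
        # Host opens a door that has a goat and is not the contestant's pick
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {monty_hall(switch=False):.3f}")  # ~0.333
print(f"switch: {monty_hall(switch=True):.3f}")   # ~0.667
```

(When the first pick is the car, this sketch simply opens the lower-numbered goat door; which goat door the host opens does not affect the overall win rates.)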

KS Haarish Dharan

Currently pursuing my undergraduate degree at NIT-Trichy in Instrumentation and Control Engineering, and I love to write about the things I read and learn.
