How to select an activation function?

Sir, I tried 2 layers and 3 layers with different numbers of neurons in each layer, using ReLU and sigmoid. I tried everything, and my accuracy always stays in the 70-88% range.

Then I finally tried 'tanh' as the activation function and got 96% accuracy on the first go.

My doubt is: how do we know which activation function is best for the hidden layers, depending on the dataset?
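
For context, here is a minimal sketch of the kind of comparison I was running (assuming a Keras-style setup; the dataset, layer sizes, and epoch count below are placeholders, not my exact configuration):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder data: any (X, y) tabular binary-classification split works here.
X_train, y_train = np.random.rand(1000, 20), np.random.randint(0, 2, 1000)
X_val, y_val = np.random.rand(200, 20), np.random.randint(0, 2, 200)

def build_model(activation):
    # Two hidden layers; sizes are illustrative only.
    model = keras.Sequential([
        keras.Input(shape=(X_train.shape[1],)),
        layers.Dense(64, activation=activation),
        layers.Dense(32, activation=activation),
        # Output layer stays sigmoid for binary classification.
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Train one model per candidate activation and compare validation accuracy.
for act in ["relu", "sigmoid", "tanh"]:
    model = build_model(act)
    history = model.fit(X_train, y_train,
                        validation_data=(X_val, y_val),
                        epochs=20, batch_size=32, verbose=0)
    print(act, "best val_accuracy:", max(history.history["val_accuracy"]))
```

This is basically how I ended up finding that tanh works best on my data.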

Hi @harshsharmajnv_9b70d236614796d5
I have shared an explanation with you via chat; please have a look.

Hope that helps :slightly_smiling_face:

Okay sir, thank you.

I hope I’ve cleared your doubt. Please rate your experience here.
Your feedback is very important; it helps us improve our platform and provide you with
the learning experience you deserve.

On the off chance that you still have questions or don’t find the answers satisfactory, you may reopen
the doubt.