Why does using the sigmoid function instead of tanh give these results?

https://colab.research.google.com/drive/123k43p15IdrylOD-cfCAkwgKCboa37wj

This is a link to my neural network model. If I use tanh as the activation function, it works fine: the error decreases and prediction accuracy exceeds 91%. But if I use sigmoid as the activation, the loss stays constant from the start.

hey @nuts2021,
I changed the loss function from mean squared error to binary cross-entropy, and now the loss decreases. With a sigmoid output, the MSE gradient carries a factor of the sigmoid's derivative, which is close to zero once the unit saturates, so training stalls; cross-entropy cancels that factor, which is why the loss starts moving again.
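
For reference, here is a minimal PyTorch sketch of that change. The layer sizes, data, and optimizer below are placeholders, not taken from the linked notebook (which may use a different framework); only the loss swap is the actual fix.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the notebook's network: sigmoid activations
# throughout, with a single sigmoid output in (0, 1).
model = nn.Sequential(
    nn.Linear(10, 16),
    nn.Sigmoid(),
    nn.Linear(16, 1),
    nn.Sigmoid(),
)

# The fix: use binary cross-entropy instead of mean squared error.
criterion = nn.BCELoss()  # was: nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Dummy batch just to show one training step.
x = torch.rand(32, 10)
y = torch.randint(0, 2, (32, 1)).float()

pred = model(x)
loss = criterion(pred, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```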