Neural Network - Activation Functions

import numpy as np

def relu(z):
    # Element-wise ReLU: max(0, z)
    return np.maximum(0, z)

def reluDerivative(x):
    # Derivative of ReLU: 1 for x > 0, 0 for x <= 0 (heaviside returns 0 at x == 0)
    return np.heaviside(x, 0)

Instead of np.tanh(z), I'm trying to use ReLU as the activation function while building a neural network from scratch, but the above code for ReLU is not running properly. What is the error in it?

Hey @gautam75,
Can you please share your full code and let me know what problem you are facing?

Thank You

Here is the link:
https://colab.research.google.com/drive/10taCh6RTw4ObqE8Yj4oYSt61tKcXsE23?usp=sharing

During training of the model using the train function, there are 3 runtime warnings and the majority of the loss values are NaN.

Hey @gautam75,
Can you please share the dataset files you used? The ones I have with the same names are different.

Thank You

These files were given in the Neural Network Challenge - Classify Points:
https://drive.google.com/drive/folders/1I6Cjj8dsQN7iDI0mSgfcUdGVnmy-dOht?usp=sharing

The same file names were used in the Logistic Regression Challenge as well, but the data was different in that one, with 3 features I guess.

Hey @gautam75,
I thoroughly checked your code and gave it a few runs too.
There is no problem from your end. The issue is that some intermediate values get very small or very large, and because of that you see runtime warnings such as overflow or divide by zero. As a result, your weights and biases turn into NaN values, which breaks the model's training.
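As a generic illustration (not your exact notebook code; z, a, and y below are made-up placeholders), this is roughly how saturated activations together with a cross-entropy loss produce exactly those warnings and the NaN losses you saw:

import numpy as np

z = np.array([50.0, -800.0])            # one very large and one very small pre-activation
a = 1.0 / (1.0 + np.exp(-z))            # RuntimeWarning: overflow encountered in exp
# the sigmoid saturates completely, so a == [1.0, 0.0]

y = np.array([1.0, 0.0])                # labels that the saturated outputs happen to match
loss = -(y * np.log(a) + (1 - y) * np.log(1 - a))
# np.log(0) -> -inf (divide-by-zero warning), 0 * -inf -> nan (invalid-value warning),
# so loss == [nan, nan], and the NaNs then spread into the gradients, weights and biases
print(loss)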

To deal with this, you will need to add more layers or add regularization.
That might help.
Otherwise, your ReLU code is correct and you can move forward with the rest of your learning.
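Just to make the regularization suggestion concrete, a rough sketch of an L2-regularized weight update (the names W, dW, lr, lambd, and m are placeholders, not the variables from your notebook) would be something like this:

import numpy as np

def update_weights(W, dW, lr=0.01, lambd=0.1, m=100):
    # L2 regularization: add (lambd / m) * W to the gradient so the weights
    # are shrunk a little on every step and are less likely to blow up
    dW_reg = dW + (lambd / m) * W
    return W - lr * dW_reg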

I hope I’ve cleared your doubt. I ask you to please rate your experience here.
Your feedback is very important; it helps us improve our platform and provide you with the learning experience you deserve.

On the off chance you still have some questions or do not find the answers satisfactory, you may reopen the doubt.