Some errors in the code

I implemented and exactly replicated the code as shown in the video, but I am still facing some errors:
1. Although I replicated the code, when I call the summary function of the class I get an error saying the NeuralNetwork class has no attribute named activation_outputs. I understand the error and even investigated it, but the code worked correctly one time and showed the error again when I re-ran it. I don't know why.
2. When we derived the formulas for the weight and bias updates, we divided np.sum(—) by m (i.e. multiplied by 1/m) to take the average; in fact, in the last video Prateek bhaiya did the same. But in this video, when I trained the data and plotted the loss, the bias update gave wrong results, and when I removed the 1/float(m) from the bias update (db) I got the correct output. What is the reason for this?
CODE LINK:-https://drive.google.com/file/d/1AAeXzz5gMTWjM2Or2CvhVv1A8SRQaIJC/view?usp=sharing
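What I think is happening with error 1, sketched minimally (this is my own reconstruction, not the exact course code; the class and method names are just placeholders matching the video): activation_outputs is only assigned inside forward(), so calling summary() before forward() raises the AttributeError, and calling it after a forward pass works.

```python
import numpy as np

class NeuralNetwork:
    """Minimal sketch: summary() depends on state created by forward()."""

    def __init__(self, input_size, hidden_size, output_size):
        self.W1 = np.random.randn(input_size, hidden_size) * 0.01
        self.W2 = np.random.randn(hidden_size, output_size) * 0.01
        # Note: self.activation_outputs is NOT created here.

    def forward(self, X):
        a1 = np.tanh(X @ self.W1)
        a2 = 1.0 / (1.0 + np.exp(-(a1 @ self.W2)))  # sigmoid output layer
        # The attribute only comes into existence after this call.
        self.activation_outputs = (a1, a2)
        return a2

    def summary(self):
        # Raises AttributeError if forward() was never called on this object.
        a1, a2 = self.activation_outputs
        print("a1 shape:", a1.shape)
        print("a2 shape:", a2.shape)

net = NeuralNetwork(2, 3, 1)
try:
    net.summary()        # fails: no forward() call yet
except AttributeError as e:
    print("Error:", e)

net.forward(np.random.randn(5, 2))
net.summary()            # works after a forward pass
```

That would explain why it "worked one time": in that run I must have called forward (or a train function that calls it) before summary.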

Hey @saksham_thukral, it's not feasible for us to check your code line by line. Better to run the code directly from GitHub, cb.lk/ml18, and check whether it also shows the error.

  1. If you are adding 1/float(m), add it to both the bias and the weight gradients; if not, remove it from both. Then train the model again in both cases and compare the results.
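The consistency point above can be sketched with a simple logistic-regression gradient step (the names m, dW, db follow the video's convention, but the rest is an illustrative assumption, not the course code): both gradients should carry the same 1/m factor, otherwise the bias is effectively updated with a learning rate m times larger than the weights, which can make the loss behave badly.

```python
import numpy as np

def gradients(X, y, w, b):
    """Averaged gradients for logistic regression; dW and db share the 1/m."""
    m = X.shape[0]
    y_hat = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid predictions
    dW = (X.T @ (y_hat - y)) / m                 # averaged over m examples
    db = np.sum(y_hat - y) / m                   # same 1/m factor here
    return dW, db

# If you drop the 1/m, drop it from BOTH dW and db and compensate with a
# smaller learning rate -- mixing the two conventions is what breaks training.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
y = (X[:, 0] > 0).astype(float)                  # toy separable labels
w, b = np.zeros(3), 0.0
for _ in range(200):
    dW, db = gradients(X, y, w, b)
    w -= 0.5 * dW
    b -= 0.5 * db
acc = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
print("training accuracy:", acc)
```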

Hope this resolves your doubt.
Don't forget to mark the doubt as resolved :blush:

But this code is not present on GitHub. I cloned the repository and checked the neural network folder, but it isn't there. If it is present, could you please send me the code link?

Hey @saksham_thukral, here is the link https://github.com/prateek27/deep-neural-network

Got the error resolved, thanks for the help :relaxed:


I hope I've cleared your doubt. Please rate your experience here.
Your feedback is very important: it helps us improve our platform and provide you
the learning experience you deserve.

On the off chance you still have some questions or do not find the answers satisfactory, you may reopen
the doubt.