My loss curve is not coming out smooth. Could you please take a look at my code? I've been stuck on this for 2 hours.
Not getting smooth curve after training
Hey @raunaqsingh10, we are here to help, don't worry. Just upload your code to Google Drive and share the link.
Hey @raunaqsingh10, I just removed the /float(m) from three positions, and then your code ran completely fine.
def backward(self, x, y, learning_rate=0.002):
    # y is the one-hot target matrix
    w1, w2, w3 = self.model['w1'], self.model['w2'], self.model['w3']
    b1, b2, b3 = self.model['b1'], self.model['b2'], self.model['b3']
    m = x.shape[0]  # batch size (unused now that /float(m) is removed)
    a1, a2, y_ = self.activation_outputs
    # Output-layer error for softmax + cross-entropy
    delta_3 = y_ - y  # shape (m, l3)
    dw3 = np.dot(a2.T, delta_3)
    db3 = np.sum(delta_3, axis=0)
    # Backpropagate through the hidden layers; (1 - a**2) is tanh'(z)
    delta_2 = np.dot(delta_3, w3.T) * (1 - np.square(a2))
    dw2 = np.dot(a1.T, delta_2)
    db2 = np.sum(delta_2, axis=0)
    delta_1 = np.dot(delta_2, w2.T) * (1 - np.square(a1))
    dw1 = np.dot(x.T, delta_1)
    db1 = np.sum(delta_1, axis=0)
    # Update the model parameters using gradient descent
    self.model['w1'] -= learning_rate * dw1
    self.model['w2'] -= learning_rate * dw2
    self.model['w3'] -= learning_rate * dw3
    self.model['b1'] -= learning_rate * db1
    self.model['b2'] -= learning_rate * db2
    self.model['b3'] -= learning_rate * db3
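One thing worth knowing about this change: summing the per-example gradients over the batch instead of averaging them only rescales every gradient by the batch size m, so dropping /float(m) behaves like multiplying the learning rate by m. Here is a minimal standalone sketch (random shapes assumed, not your actual data) showing the two versions differ only by that constant factor:

```python
import numpy as np

m = 32                               # assumed batch size for illustration
delta = np.random.randn(m, 10)       # per-example output errors
a = np.random.randn(m, 64)           # activations from the previous layer

dw_sum = np.dot(a.T, delta)              # gradient summed over the batch
dw_avg = np.dot(a.T, delta) / float(m)   # gradient averaged over the batch

# The two differ only by the constant factor m
print(np.allclose(dw_sum, m * dw_avg))  # True
```

So if the loss curve gets noisy after this change, shrinking learning_rate by roughly a factor of m recovers the same step size as the averaged version.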
Hope this resolves your doubt.
Please mark the doubt as resolved in my doubts section.
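If the curve still misbehaves, a numerical gradient check is a quick way to confirm a backward pass is correct. This is a hypothetical standalone sketch (a toy one-layer softmax + cross-entropy model, not your class) comparing the analytic gradient a.T @ (y_ - y) against a finite-difference estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n_in, n_out = 8, 5, 3
a = rng.standard_normal((m, n_in))            # layer input
w = rng.standard_normal((n_in, n_out))        # layer weights
y = np.eye(n_out)[rng.integers(0, n_out, m)]  # one-hot targets

def loss(w):
    z = a @ w
    z = z - z.max(axis=1, keepdims=True)      # numerically stable softmax
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return -np.sum(y * np.log(p))             # summed cross-entropy (no /m)

def analytic_grad(w):
    z = a @ w
    z = z - z.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return a.T @ (p - y)                      # same form as dw3 above

# Central finite differences on each weight
eps = 1e-6
num = np.zeros_like(w)
for i in range(n_in):
    for j in range(n_out):
        wp, wm = w.copy(), w.copy()
        wp[i, j] += eps
        wm[i, j] -= eps
        num[i, j] = (loss(wp) - loss(wm)) / (2 * eps)

print(np.max(np.abs(num - analytic_grad(w))))  # should be tiny, ~1e-8
```

If the analytic and numerical gradients disagree, the bug is in the backward pass itself rather than in the learning rate or the /float(m) scaling.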