My algorithm is not minimizing the loss: the loss graph goes upwards instead of downwards.
Link to the code: https://ide.codingblocks.com/s/162545
Can you tell me what the problem is?
Learning rate is 0.01, max_itr = 200, and the parameters are initialized as:

w = np.zeros((X_train.shape[1],))
b = 0.0
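
For reference, this is the hypothesis and loss I am plotting each iteration (a simplified, self-contained sketch of the same setup, not the exact code from the link; the toy X_train / y_train below only stand in for my real data so the snippet runs):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(X, y, w, b):
    # binary cross-entropy (negative log-likelihood); this is the value I plot each iteration
    hx = sigmoid(X @ w + b)
    return -np.mean(y * np.log(hx + 1e-12) + (1 - y) * np.log(1 - hx + 1e-12))

# toy stand-ins for my real X_train / y_train
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 2))
y_train = (X_train[:, 0] > 0).astype(float)

w = np.zeros((X_train.shape[1],))
b = 0.0
print(cross_entropy(X_train, y_train, w, b))   # about 0.693 at the zero initialization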
Logistic regression
Your code is almost completely correct; there is just one minor mistake in it.
Remove the -1 from lines 19 and 20 so the gradient accumulation becomes:
grad_w += (y_true[i] - hx) * x[i]   # gradient of the log-likelihood w.r.t. w
grad_b += (y_true[i] - hx)          # gradient of the log-likelihood w.r.t. b
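
To see why that extra negative sign sends the loss in the wrong direction: with an update of the form w += learning_rate * grad_w, grad_w has to be the gradient of the log-likelihood, i.e. (y - hx) * x with no -1; keeping the -1 makes the update ascend the loss, which is exactly the upward curve you are seeing. Below is a minimal self-contained sketch of the corrected loop (the toy data and the averaging over samples are just for illustration, not your exact code from the link):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# hypothetical toy data standing in for the real X_train / y_train
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))
y_train = (X_train @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

learning_rate = 0.01
max_itr = 200
w = np.zeros((X_train.shape[1],))
b = 0.0

losses = []
for it in range(max_itr):
    grad_w = np.zeros_like(w)
    grad_b = 0.0
    for i in range(X_train.shape[0]):
        hx = sigmoid(np.dot(w, X_train[i]) + b)
        grad_w += (y_train[i] - hx) * X_train[i]   # no extra -1 here
        grad_b += (y_train[i] - hx)
    # gradient ascent on the log-likelihood == gradient descent on the cross-entropy loss
    w += learning_rate * grad_w / X_train.shape[0]
    b += learning_rate * grad_b / X_train.shape[0]
    hx_all = sigmoid(X_train @ w + b)
    losses.append(-np.mean(y_train * np.log(hx_all + 1e-12)
                           + (1 - y_train) * np.log(1 - hx_all + 1e-12)))

print(losses[0], losses[-1])   # the loss now decreases over the iterations

The other valid fix would be to keep the -1 and flip the update to w -= learning_rate * grad_w, but changing only one of the two (as in your current code) moves the parameters in the direction that increases the loss.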
Hope this cleared your doubt.