While executing logistic regression function

I get only one scalar value in `error_list` when I call the `gradient_descent` function, but it is supposed to be a list. Why does it contain only one value?

Hey @lgoyal50_be19, you need to share your code link with me; only then will I be able to debug your problem. `error_list` should contain the errors from all the iterations your model runs.
Please share your code so that I can explain this better to you! :+1:

Happy Learning ! :slightly_smiling_face:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hypothesis(X, theta):
    return sigmoid(np.dot(X, theta))

def error(X, Y, theta):
    hypo = hypothesis(X, theta)
    error = -1 * np.mean((Y * np.log(hypo) + ((1 - Y) * np.log(1 - hypo))))
    # Scalar value
    return error

def gradient(X, Y, theta):
    hypo = hypothesis(X, theta)
    grad = -np.dot(X.T, (Y - hypo))
    m = X.shape[0]
    return grad / m

def gradient_descent(X, Y, lr=0.1, max_itr=500):
    n = X.shape[1]
    theta = np.zeros((n, 1))
    error_list = []
    for i in range(max_itr):
        e = error(X, Y, theta)
        error_list.append(e)
        grad = gradient(X, Y, theta)
        theta = theta - lr * grad
        return theta, error_list
```

Hey @lgoyal50_be19, this is not the right way to share your code with me. Please upload it to Google Drive (as an .ipynb notebook) so that I can see the execution of the cells and understand the logic of your code with its proper indentation.

Thanks !

Hey @lgoyal50_be19, please see the error in the screenshot attached below:

Have a look at the `gradient_descent` function. You are returning from the function after just one iteration, which is why you get only one error point. To fix this, move the `return` statement outside the for loop, so it runs only after the loop has finished.
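Here is a minimal, self-contained sketch of the corrected `gradient_descent`, with the `return` after the loop. The helper functions are the same as in your code; the toy dataset at the bottom is only an assumption for demonstration, not from your notebook.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hypothesis(X, theta):
    return sigmoid(np.dot(X, theta))

def error(X, Y, theta):
    # Binary cross-entropy loss (a scalar)
    hypo = hypothesis(X, theta)
    return -1 * np.mean(Y * np.log(hypo) + (1 - Y) * np.log(1 - hypo))

def gradient(X, Y, theta):
    hypo = hypothesis(X, theta)
    m = X.shape[0]
    return -np.dot(X.T, (Y - hypo)) / m

def gradient_descent(X, Y, lr=0.1, max_itr=500):
    n = X.shape[1]
    theta = np.zeros((n, 1))
    error_list = []
    for i in range(max_itr):
        e = error(X, Y, theta)
        error_list.append(e)
        grad = gradient(X, Y, theta)
        theta = theta - lr * grad
    # return OUTSIDE the loop: now error_list holds all max_itr errors
    return theta, error_list

# Toy data (hypothetical): bias column + one feature, labels 0/0/1/1
X = np.hstack([np.ones((4, 1)), np.array([[0.0], [1.0], [2.0], [3.0]])])
Y = np.array([[0], [0], [1], [1]])

theta, error_list = gradient_descent(X, Y, max_itr=100)
print(len(error_list))   # one entry per iteration
```

With the `return` inside the loop, `len(error_list)` would be 1; after the fix it equals `max_itr`, and the loss should decrease across iterations.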

I hope this clears your doubt ! :+1:
Happy Learning ! :slight_smile: