Hardwork pays OFF

Theta values are not updating

Code:

import numpy as np

def hypothesis(X, theta):
    # Prediction for a single example: h(x) = theta[0] + theta[1] * x
    return theta[0] + theta[1] * X

def error(X, y, theta):
    # Total squared error over all m training examples
    m = X.shape[0]
    total_error = 0
    for i in range(m):
        Hx = hypothesis(X[i], theta)
        total_error += (Hx - y[i]) ** 2
    return total_error

def gradient(theta, X, y):
    # Gradient of the squared-error cost with respect to theta[0] and theta[1]
    grad = np.zeros((2,))
    m = X.shape[0]
    for i in range(m):
        Hx = hypothesis(X[i], theta)
        grad[0] += Hx - y[i]
        grad[1] += (Hx - y[i]) * X[i]
    return grad

def GradientDescent(X, y, learning_rate=0.001):
    # theta = np.zeros((2,))
    theta = np.array([-2.0, 1.0])
    itr = 0
    error_list = []
    theta_list = []
    max_itr = 100
    while itr <= max_itr:
        grad = gradient(theta, X, y)
        e = error(X, y, theta)
        error_list.append(e)
        theta_list.append((theta[0], theta[1]))
        # The update must subtract learning_rate * grad[...]; without the
        # grad factor, theta barely changes and appears "not to update".
        theta[0] = theta[0] - learning_rate * grad[0]
        theta[1] = theta[1] - learning_rate * grad[1]
        itr += 1
    return theta, error_list, theta_list

# X and y are the training data, assumed to be defined earlier as 1-D numpy arrays
final_theta, error_list, theta_list = GradientDescent(X, y)
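
As a quick sanity check, here is a minimal sketch that runs the corrected routine on synthetic data (the arrays X and y below are illustrative assumptions, not part of the original post). With the learning_rate * grad update in place, the error should fall steadily and theta should move away from its (-2.0, 1.0) starting point toward the true parameters:

import numpy as np

# Hypothetical data: y = 3 + 2x plus a little Gaussian noise
np.random.seed(0)
X = np.random.uniform(0, 5, size=(50,))
y = 3 + 2 * X + np.random.normal(0, 0.1, size=(50,))

final_theta, error_list, theta_list = GradientDescent(X, y, learning_rate=0.001)

print(theta_list[0])                  # starting point, (-2.0, 1.0)
print(final_theta)                    # should have moved toward (3, 2)
print(error_list[0], error_list[-1])  # the error should decrease

If the printed theta values are still (-2.0, 1.0) after running, the update lines are almost certainly still missing the gradient term.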

Hi @dtele,
It's very difficult to understand the code formatted like this. I would suggest that you use cb.lk/ide to share your code.
Thanks 🙂

I hope I've cleared your doubt. Please rate your experience here; your feedback is very important. It helps us improve our platform and provide you with the learning experience you deserve.

On the off chance that you still have some questions or do not find the answers satisfactory, you may reopen the doubt.

I have uploaded my code to the respective ID; please do check.