Linear regression implementation

Why are we taking the original x in the hypothesis function, while in the other functions like error, gradient, and gradientDescent we are taking the normalized value?

Hey, you are confusing the function parameters at the time of function definition with the arguments passed at the time of the function call.

final_theta, error_list, theta_list = gradientDescent(X, Y)  # this is a function call that runs gradient descent using the normalised data, while the cell above this call is where the functions are defined.
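
For context, here is a minimal sketch of how the pieces can fit together. The function names and the final call match the ones in this thread, but the bodies, the toy data, and the normalization step are my own assumptions, so the details in your notebook will differ:

import numpy as np

# hypothetical raw data (assumption, just for illustration)
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 4.2, 5.9, 8.1])

# normalize X before training; this normalized array is what gets passed below
X = (X - X.mean()) / X.std()

def hypothesis(x, theta):
    return theta[0] + theta[1]*x

def error(X, Y, theta):
    # mean squared error over all training examples
    e = 0.0
    for i in range(X.shape[0]):
        e += (hypothesis(X[i], theta) - Y[i])**2
    return e / X.shape[0]

def gradient(X, Y, theta):
    # gradient of the error with respect to theta[0] and theta[1]
    grad = np.zeros((2,))
    for i in range(X.shape[0]):
        hx = hypothesis(X[i], theta)
        grad[0] += (hx - Y[i])
        grad[1] += (hx - Y[i]) * X[i]
    return grad / X.shape[0]

def gradientDescent(X, Y, lr=0.1, max_iters=100):
    theta = np.zeros((2,))
    error_list = []
    theta_list = []
    for _ in range(max_iters):
        theta = theta - lr * gradient(X, Y, theta)
        error_list.append(error(X, Y, theta))
        theta_list.append(theta.copy())
    return theta, error_list, theta_list

# the X passed here is the normalized X, so inside the functions the
# parameters x / X already hold normalized values
final_theta, error_list, theta_list = gradientDescent(X, Y)

The parameter x inside hypothesis is just a placeholder; it holds whatever value is passed in at call time, which in this pipeline is the normalized data.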

So

def hypothesis(x, theta):
    return theta[0] + theta[1]*x

can also be defined as

def hypothesis(a, b):
    return b[0] + b[1]*a
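
A quick way to convince yourself that the two definitions are interchangeable (the theta values below are made up purely for illustration):

def hypothesis(x, theta):
    return theta[0] + theta[1]*x

def hypothesis_alt(a, b):  # same logic, only the parameter names differ
    return b[0] + b[1]*a

theta = [1.5, 0.5]
print(hypothesis(2.0, theta))      # 2.5
print(hypothesis_alt(2.0, theta))  # 2.5, identical output

The names used in the def line only exist inside the function body; what matters is the value you actually pass when you call it.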