Theta answer differs by a factor of 10

When I wrote the code side by side with Prateek sir's video it worked, but now there is a difference in the value of theta and I don't know why.

Hey @S19LPN0163, check your learning rate and number of epochs and see if they are the same as in the video.

If I set the learning rate to 1, then I get close to the answer. The number of epochs does not affect the value of theta, whereas increasing the learning rate decreases theta by the same factor.

Hey @S19LPN0163, okay, then use learning rate = 1.

But that should not change for the same dataset. When I did it while watching the video, I used the same learning rate as the video and got the answer, but now, with almost the same code, I am unable to get it. What I want to know is why I am getting a different answer when the code is almost the same as in the video. I want to understand where I am going wrong such that I need to alter the learning rate in order to get the answer.

Hey @S19LPN0163, if you have not used mini-batch gradient descent or something similar, then the answer should not differ. The best way to check is to run the same code from GitHub (cb.lk/ml18) and see what it produces. If it produces the same result as in the video, then there is an error somewhere in your code.

Yes, ml18 gives the same answer as in the video. I also tried to find the error in my code but I couldn't find it. Can you please look into it once?
Thank you

Hey @S19LPN0163, this suggests that there is a very minute error in your code. It's not feasible at my end to debug that part; you need to do that yourself. Try replacing your code function by function with sir's code until the output changes, which will isolate the faulty function. For reference, a plain batch gradient descent loop looks something like the sketch below.
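A minimal sketch, assuming linear regression with squared-error loss on a NumPy feature matrix X (with a bias column) and target vector y; the function names, the 1/m averaging, and the default hyperparameters here are my assumptions, not sir's exact code. Since your theta shrinks as the learning rate grows, one thing worth checking while comparing is whether your update multiplies by the learning rate (as below) rather than dividing by it, and whether the gradient is scaled consistently.

```python
import numpy as np

def hypothesis(X, theta):
    # predictions for all examples; X is (m, n+1) with a bias column
    return X.dot(theta)

def gradient(X, y, theta):
    # average gradient of the squared-error loss over all m examples
    m = X.shape[0]
    error = hypothesis(X, theta) - y
    return X.T.dot(error) / m

def gradient_descent(X, y, learning_rate=0.1, epochs=300):
    theta = np.zeros(X.shape[1])
    for _ in range(epochs):
        # the learning rate multiplies the gradient here;
        # dividing by it instead would make theta shrink as the rate grows
        theta = theta - learning_rate * gradient(X, y, theta)
    return theta
```

If this reference loop gives the video's theta on your dataset but your code does not, swapping these pieces in one at a time should reveal which function differs.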

I hope I've cleared your doubt. I ask you to please rate your experience here. Your feedback is very important; it helps us improve our platform and hence provide you the learning experience you deserve.

On the off chance you still have some questions or do not find the answers satisfactory, you may reopen the doubt.