Error is increasing instead of decreasing

This is the code for linear regression on Boston house dataset:

Here, instead of decreasing with iterations, the error is increasing. I have gone through the code a couple of times but can't find the mistake. Moreover, I am getting the same results for both the loop-based method and the vectorized method.
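The original code isn't included in the post, but a minimal sketch of the kind of vectorized gradient-descent setup being discussed could look like this (synthetic data stands in for the Boston features, and all names here are hypothetical, not from the original code):

```python
import numpy as np

# Synthetic stand-in for the Boston features (the original data isn't shown)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 1.0 + rng.normal(scale=0.1, size=100)

# Prepend a bias column so theta[0] is the intercept
Xb = np.hstack([np.ones((X.shape[0], 1)), X])

def mse(theta):
    err = Xb @ theta - y
    return (err @ err) / len(y)

def gradient(theta):
    # Vectorized gradient of the mean squared error
    return 2.0 * Xb.T @ (Xb @ theta - y) / len(y)

def train(lr, iters=200):
    theta = np.zeros(Xb.shape[1])
    errors = []
    for _ in range(iters):
        theta -= lr * gradient(theta)
        errors.append(mse(theta))
    return errors

errors = train(lr=0.1)  # with a small enough lr, the error decreases steadily
```

A loop-based version would compute the same per-feature partial derivatives with explicit `for` loops over examples and features; both should produce identical errors per iteration, as the original poster observed.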

Hey @SanchitSayala, there is an error in the gradient function in both versions of the code (the for-loop and the vectorized implementation). Please check it once.

Hope this helps.
All the best :slight_smile:

I found the problem; the gradient function was completely fine. The issue was the value I chose for the learning rate: I had set it to 0.5, and after changing it to 0.1 the program worked fine.
Now, since the learning rate varies from problem to problem, is there a particular way of guessing a reasonably good value for a given problem?

@SanchitSayala, by changing the learning rate from 0.5 to 0.1, your error plot cannot change from Figure 1 to Figure 2 shown below. There must be some other error in the code. Anyway, if you think your code is working fine, that's great. The learning rate usually varies from problem to problem. The commonly followed approach is to try 2-3 different learning rates and select the one that gives you the best loss without sacrificing training speed.

[Figure 1: error plot]

[Figure 2: error plot]
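The "try 2-3 different learning rates" advice can be sketched as a small sweep. This is a hypothetical example on synthetic data (none of these names or values come from the original code):

```python
import numpy as np

# Synthetic regression data (stand-in for the actual dataset)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)
Xb = np.hstack([np.ones((100, 1)), X])  # bias column + features

def final_loss(lr, iters=200):
    """Run gradient descent with the given learning rate; return final MSE."""
    theta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        theta -= lr * 2.0 * Xb.T @ (Xb @ theta - y) / len(y)
    err = Xb @ theta - y
    return (err @ err) / len(y)

# Try a few candidate learning rates and keep the one with the best loss
losses = {lr: final_loss(lr) for lr in (0.001, 0.01, 0.1)}
best_lr = min(losses, key=losses.get)
```

Too small a rate trains slowly (high loss after a fixed budget of iterations); too large a rate diverges; the sweep picks a middle ground.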

Hope this helps.

I too thought that changing the learning rate alone should not increase the error, but the only change I made in the program was the learning rate, and as a result the error plot changed.
To understand why this was happening, I tried a large learning rate on a linear regression problem with a single feature. I found that while gradient[0], responsible for theta[0], was decreasing with iterations as expected, gradient[1], responsible for theta[1], was increasing. As a result, the slope of our best-fit line moved away from the scatter points of the training data, and that led to an increase in error.
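The observation above can be reproduced with a short sketch (synthetic single-feature data, hypothetical names): with a learning rate that is too large for the feature scale, the magnitude of gradient[1] grows each iteration instead of shrinking, and the fit diverges:

```python
import numpy as np

# Single-feature regression with an unscaled feature in [0, 10]
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(size=50)
Xb = np.column_stack([np.ones_like(x), x])  # bias column + feature

def run(lr, iters=100):
    """Run gradient descent and record |gradient[1]| at each iteration."""
    theta = np.zeros(2)
    history = []
    for _ in range(iters):
        grad = 2.0 * Xb.T @ (Xb @ theta - y) / len(y)
        theta -= lr * grad
        history.append(abs(grad[1]))
    return history

diverging = run(lr=0.5)    # too large: |gradient[1]| blows up
converging = run(lr=0.01)  # small enough: |gradient[1]| shrinks
```

Because the feature is not standardized, the curvature along theta[1] is much larger than along theta[0], so a rate that looks moderate overshoots in the theta[1] direction every step. Standardizing the features (which the Boston columns also need) makes a single learning rate work for all parameters.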

Thanks for the help anyway.

Oh great @SanchitSayala, that was a good observation. I was not able to see that because I did not run the whole code. But yes, well spotted.

Happy Learning :slight_smile: