Loss function minimisation in weighted linear regression

In the "Closed form solution of weighted regression" lesson, we are minimising the loss function. It says that if a point is near the query point, then w_i increases, i.e., its value gets closer to 1. But then the loss function should increase in this case, because w_i is increasing and hence the weighted squared-error term increases. Aren't these two statements contradictory?
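
(For reference, the loss being discussed is presumably the standard locally weighted least-squares objective; reconstructed here in the usual notation, where x is the query point and τ is the bandwidth:)

```latex
J(\theta) = \sum_{i=1}^{m} w^{(i)} \left( y^{(i)} - \theta^{T} x^{(i)} \right)^{2},
\qquad
w^{(i)} = \exp\!\left( -\frac{\left( x^{(i)} - x \right)^{2}}{2\tau^{2}} \right)
```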

Hi @Pranav-Gupta-1716510788408154,

The above statements aren't contradictory. First of all, only the points close to the query get weights near 1; at the same time, the weights of points far away are close to 0, so the two effects balance each other out.
This is just the mathematical way of expressing that closer points have a greater say in deciding the hypothesis function.
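
Here is a small sketch of that balancing effect (the function name and the bandwidth value are my own, assuming the Gaussian kernel form of the weights):

```python
import numpy as np

def gaussian_weights(x_train, x_query, tau=0.5):
    """Weight of each training point relative to a query point.

    w_i = exp(-(x_i - x)^2 / (2 * tau^2)): close to 1 for nearby
    points, close to 0 for faraway ones.
    """
    return np.exp(-((x_train - x_query) ** 2) / (2 * tau ** 2))

x_train = np.array([0.9, 1.0, 1.1, 5.0, 9.0])
print(np.round(gaussian_weights(x_train, x_query=1.0), 3))
# -> [0.98 1.   0.98 0.   0.  ]  (far points barely contribute)
```

So a large w_i for a near point does make that point's squared error count more in the loss, but that is exactly the intent: θ is then forced to fit the nearby points well, while errors on the faraway points are essentially ignored.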

Secondly, in this solution the weights W are not the parameters we minimise the loss over; they act more like scaling constants (fixed for a given query point).
The actual parameters we optimise are the entries of the theta (θ) vector.
That's why we differentiate with respect to θ: we want the optimum value of θ that minimises the loss, while W remains unchanged.
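
To make the two roles concrete, here is a minimal sketch (the names are hypothetical, assuming the standard closed form θ = (XᵀWX)⁻¹XᵀWy): W is built once from the query point and held fixed, and only θ is solved for.

```python
import numpy as np

def weighted_theta(X, y, w):
    """Closed-form theta = (X^T W X)^(-1) X^T W y for fixed weights w."""
    W = np.diag(w)  # weights: per-query scaling constants, not parameters
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Toy data: y = 2x + 1, with a column of ones for the intercept term.
x = np.linspace(0, 10, 20)
X = np.column_stack([np.ones_like(x), x])
y = 2 * x + 1
w = np.exp(-((x - 5.0) ** 2) / (2 * 0.5 ** 2))  # weights for query x = 5
print(weighted_theta(X, y, w))  # -> approximately [1. 2.]
```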

Hope this resolves your doubt!