In logistic regression, in the gradient function, why are we not inserting one more loop over range(w.shape) and indexing into w?


Hey @hssharma2212000, the reason is that we are dealing with vectors here: every time we use w, grad_w, etc., the complete vector is added or subtracted at once. For example, in the line
w = w + lr*grad_w, w contains all the weights, and every component is updated simultaneously by this single formula. There is no need for an explicit loop with indexing; it is just the vectorized way of implementing the same update.
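To make this concrete, here is a minimal sketch (assuming a standard NumPy logistic regression setup with a sigmoid and a small made-up dataset; the variable names mirror the snippet above) showing that the one-line vectorized update is exactly equivalent to the explicit loop over `range(w.shape[0])` with indexing that the question asks about:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# hypothetical tiny dataset: 5 samples, 3 features
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = np.array([0, 1, 1, 0, 1])

w = np.zeros(3)
lr = 0.1

# gradient with respect to ALL weights at once -- this is a vector,
# not a single number, which is why no extra loop is needed
grad_w = X.T @ (y - sigmoid(X @ w))

# vectorized update: every component of w is updated in one line
w_vectorized = w + lr * grad_w

# equivalent explicit loop with indexing (what the question suggests)
w_loop = w.copy()
for j in range(w.shape[0]):
    w_loop[j] = w_loop[j] + lr * grad_w[j]

# both give exactly the same weights
assert np.allclose(w_vectorized, w_loop)
```

Because NumPy applies `+` and `*` element-wise to whole arrays, the single line does all the per-weight updates internally, which is both shorter and faster than the Python-level loop.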

Hope this resolves your doubt.
Please mark the doubt as resolved in my doubts section. :blush:

I hope I've cleared your doubt. Please rate your experience here; your feedback is very important and helps us improve our platform so we can provide you the learning experience you deserve.

If you still have questions or don't find the answer satisfactory, you may reopen the doubt.