SVM gradient formula

In the video “SVM-Pegasos Algorithm for Unconstrained Optimization” we calculate the gradient of the loss function as
grad(loss) = w + (x_i * y_i)  if 1 - t_i > 0
grad(loss) = w                if 1 - t_i <= 0
but in the code in the link “https://github.com/arshagarwal/machine-learning-online-2018/blob/master/12.%20Support%20Vector%20Machines/SVM.ipynb” we are not adding w.

Is this an error, or am I getting it wrong?
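
For reference, here is my understanding of where the w term comes from (a sketch assuming a regularization constant of 1; the video's exact sign convention may differ): w is the gradient of the (1/2)‖w‖² regularizer, and the x_i y_i term is the gradient of the hinge part:

```latex
f(w) = \frac{1}{2}\lVert w \rVert^{2} + \max\bigl(0,\; 1 - t_i\bigr),
\qquad t_i = y_i \,(w \cdot x_i)

\nabla_w f =
\begin{cases}
w - y_i\, x_i & \text{if } 1 - t_i > 0 \\
w             & \text{if } 1 - t_i \le 0
\end{cases}
```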

Can you point me to the time in the video where it is said that

grad(loss) = w + (x_i * y_i)  if 1 - t_i > 0
grad(loss) = w                if 1 - t_i <= 0

?

At 17:50, in the final highlighted expression.

Found it! The w term is added at the end.
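
For anyone else confused by this: including w inside each case of the gradient and adding w once at the end are equivalent. Here is a minimal sketch (not the notebook's actual code; the function names are hypothetical, and the standard minus-sign convention for the hinge term is assumed):

```python
import numpy as np

def gradient_combined(w, x_i, y_i):
    # Gradient with the regularization term w included in each case,
    # as in the video's highlighted expression (lambda = 1 assumed).
    t_i = y_i * np.dot(w, x_i)
    if 1 - t_i > 0:
        return w - y_i * x_i  # hinge term is active
    return w                  # hinge term is zero

def gradient_deferred(w, x_i, y_i):
    # Same gradient, but with the w term "added in the end",
    # the way the notebook appears to structure the computation.
    t_i = y_i * np.dot(w, x_i)
    hinge_grad = -y_i * x_i if 1 - t_i > 0 else np.zeros_like(w)
    return hinge_grad + w     # regularization term added at the end

# Both orderings produce the identical gradient:
rng = np.random.default_rng(0)
w, x = rng.normal(size=3), rng.normal(size=3)
for y in (-1, 1):
    assert np.allclose(gradient_combined(w, x, y), gradient_deferred(w, x, y))
```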
