Which one is the loss function?

In this video on logistic regression, two functions are shown:

one is concave, on which gradient ascent is implemented,
and the other is convex, taken as the negative of the log likelihood equation.

Of these two, which one is the loss function?

What are these two equations actually representing? Please clear my doubt.

Hey @settingsingh, the likelihood is something we want to be as large as possible: our model should assign the highest possible probability to the observed data, so we maximize the log likelihood. That is why gradient ascent is applied to it.

On the other hand, the negative of the log likelihood is treated as the loss, and we need to minimize it. So gradient descent is applied to it.
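To see that these are really the same procedure, here is a minimal sketch (assuming a made-up toy dataset and hand-written gradients, not the course's code): gradient ascent on the log likelihood and gradient descent on the negative log likelihood produce identical weight updates.

```python
import numpy as np

# Hypothetical toy data: one feature plus a bias column.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y = (x[:, 0] + rng.normal(scale=0.5, size=100) > 0).astype(float)
X = np.hstack([np.ones((100, 1)), x])  # [bias, feature]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_log_likelihood(w):
    # Gradient of the log likelihood for logistic regression: X^T (y - p)
    return X.T @ (y - sigmoid(X @ w))

lr = 0.01
w_ascent = np.zeros(2)   # maximize log likelihood
w_descent = np.zeros(2)  # minimize negative log likelihood
for _ in range(1000):
    # Gradient ASCENT: step in the direction of the gradient.
    w_ascent += lr * grad_log_likelihood(w_ascent)
    # Gradient DESCENT on the loss: step against the gradient of -log L,
    # which is algebraically the same update.
    w_descent -= lr * (-grad_log_likelihood(w_descent))

print(np.allclose(w_ascent, w_descent))  # True
```

Since the update `+lr * g` equals `-lr * (-g)`, the two loops compute exactly the same weights; the only difference is whether you call the objective a likelihood to maximize or a loss to minimize.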

Hope this resolves your doubt.
Please mark the doubt as resolved in my doubts section. :blush:

So basically, we can work with either one and proceed with it to implement our Logistic Regression model? Am I right?

Hey @settingsingh, yes definitely.