Logistic Regression - Likelihood estimation and Loss

How did you combine the two probability equations P(y=1|x;theta) = h_theta(x) and P(y=0|x;theta) = 1 - h_theta(x) into a single equation for maximum likelihood? [Time 08:00 in video]

Hey jyoti,

P^yi * (1 - P)^(1 - yi)

In this expression, if yi = 1:
The exponent (1 - yi) is 0, so (1 - P)^(1 - yi) = 1, and we are left with P^yi = P, which is h_theta(x).

If yi = 0:
The exponent yi is 0, so P^yi = 1, and we are left with (1 - P)^(1 - yi) = 1 - P, which is 1 - h_theta(x).

The single expression therefore reproduces both cases, which is how the two statements were combined into one.
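To see this reduction numerically, here is a minimal Python sketch of the combined expression P^y * (1 - P)^(1 - y). The value p = 0.8 for h_theta(x) is just an illustrative assumption:

```python
def bernoulli_likelihood(p, y):
    """Combined likelihood P^y * (1-P)^(1-y) for a single example.

    When y = 1 the second factor has exponent 0 and equals 1,
    leaving p; when y = 0 the first factor equals 1, leaving 1-p.
    """
    return p**y * (1 - p)**(1 - y)

# p stands in for h_theta(x); 0.8 is an arbitrary illustrative value
p = 0.8
print(bernoulli_likelihood(p, 1))  # reduces to p
print(bernoulli_likelihood(p, 0))  # reduces to 1 - p
```

Running it confirms that the single formula gives p when y = 1 and 1 - p when y = 0, exactly matching the two separate cases.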

I hope this clears your doubt,
Thanks :slight_smile:

I hope I've cleared your doubt. Please rate your experience here.
Your feedback is very important; it helps us improve our platform and provide you
with the learning experience you deserve.

If you still have questions or don't find the answer satisfactory, you may reopen
the doubt.