Likelihood estimation in Logistic regression

Suppose there are 6 training points (x1…x6), and the hypothesis values come out as follows.
Recall that hθ(x) = sigmoid(wᵀx + b), where w and b are the weights and bias learned by the model during training.
hθ(x1) = 0.6
hθ(x2) = 0.4
hθ(x3) = 0.7
hθ(x4) = 0.1
hθ(x5) = 0.4
hθ(x6) = 0.9
The likelihood is defined as the product of the probabilities p(yi|xi). Assuming 100% correct classification, what would the likelihood value (without log) be for the above case?

As you mentioned, the likelihood is the product of Bernoulli probabilities. So, in your case, assuming 100% correct classification, a classification threshold of 0.5, and hθ(x) being the probability of the label being 1:

The likelihood = 0.6 × (1 - 0.4) × 0.7 × (1 - 0.1) × (1 - 0.4) × 0.9 = 0.122472

Note: remember, each point contributes a Bernoulli term P^y × (1 - P)^(1 - y), where P = hθ(x) and y is the true label.
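
As a quick sanity check, here is a minimal Python sketch (not from the original post) that reproduces this product. The hθ values are copied from the question above, and the labels are assumed to be whatever the 0.5 threshold predicts, since we assume 100% correct classification:

```python
# Hypothesis values hθ(xi) for the 6 training points from the question
h = [0.6, 0.4, 0.7, 0.1, 0.4, 0.9]

# Assumed true labels: with 100% correct classification at threshold 0.5,
# the true label matches the predicted one
y = [1 if p >= 0.5 else 0 for p in h]

likelihood = 1.0
for p, label in zip(h, y):
    # Bernoulli term: p^y * (1 - p)^(1 - y)
    likelihood *= p ** label * (1 - p) ** (1 - label)

print(likelihood)  # ≈ 0.122472 (up to floating-point rounding)
```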

Happy Learning,
Thanks :)



Could you explain the right answers for logistic regression quiz problems 6, 3, and 1?

Hello @Management718,

Could you please open a separate thread for those questions?
Thanks :)