One vs Rest: Class Imbalance Problem?

Hi,

In the One-vs-Rest approach, don't we get trapped in the class imbalance problem? In the given example we have 4 classes, and assume the size of each class is S. Then for any one class i, the One-vs-Rest split has S data points for the "one" class and 3S data points for the "rest".

So ultimately we are making each binary model biased towards the "rest" class.
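For concreteness, here is a minimal sketch (mine, not from the course material; it only assumes NumPy and a made-up class size S = 100) that counts the labels each binary classifier would see:

```python
# Counting labels in each one-vs-rest split for 4 classes of equal size S.
import numpy as np

S = 100                                   # assumed size of each class
y = np.repeat([0, 1, 2, 3], S)            # 4 classes, S points each

for i in range(4):
    y_binary = (y == i).astype(int)       # class i -> 1, the other classes -> 0
    pos = y_binary.sum()                  # S positives
    neg = len(y_binary) - pos             # 3S negatives
    print(f"classifier for class {i}: {pos} positives vs {neg} negatives")
```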

So how do we resolve this problem?

Please comment on this.

Basically, classification techniques like support vector machines are not very sensitive to class imbalance, because the decision boundary depends mainly on the support vectors near the margin rather than on how many points each class has. So yes, One-vs-Rest does create imbalanced classes, but this only hurts test accuracy if the model itself is sensitive to imbalance; otherwise it does not. That is the reason we use techniques like the naive Bayes classifier, SVMs, etc. A minimal setup is sketched below.
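This is a minimal sketch only (it assumes scikit-learn and a synthetic 4-class dataset, neither of which is specified in the thread). The class_weight="balanced" option is one common way to offset the S-vs-3S split in each binary problem, in case it does turn out to matter:

```python
# One-vs-rest with a linear SVM on a 4-class toy dataset (illustrative only).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_classes=4, n_informative=6,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# class_weight='balanced' re-weights each binary sub-problem so the single
# positive class is not swamped by the 3 "rest" classes.
clf = OneVsRestClassifier(SVC(kernel="linear", class_weight="balanced"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```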

Hope this cleared your doubt.

Can you please elaborate more? The part "it will affect testing accuracies in case our model is sensitive to imbalanced classification" is not clear to me. Could you explain in detail why One-vs-Rest does not suffer from the class imbalance problem?

Techniques like SVMs are largely insensitive to imbalance because the separating boundary is determined by the support vectors, not by the total number of points in each class. So even if one class has 20 examples and the other has 80, the model will still be able to classify the test data accurately, i.e. when a new test image is passed the model will predict it correctly. And this is the reason we are able to do classification using One-vs-Rest even though it creates imbalanced data. A quick sanity check you can run yourself is sketched below.
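A minimal sketch (assuming scikit-learn and a made-up 20-vs-80 Gaussian toy dataset) for checking this yourself; per-class precision and recall are the numbers to look at, since overall accuracy can hide a poorly predicted minority class:

```python
# A 20-vs-80 binary problem to check how an SVM copes with the kind of
# imbalance that one-vs-rest creates.
import numpy as np
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two Gaussian blobs: 20 minority points around (2, 2), 80 around (0, 0).
X = np.vstack([rng.normal(2.0, 1.0, size=(20, 2)),
               rng.normal(0.0, 1.0, size=(80, 2))])
y = np.array([1] * 20 + [0] * 80)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)
# Per-class precision/recall shows whether the minority class is actually
# being predicted, which overall accuracy alone can hide.
print(classification_report(y_test, clf.predict(X_test)))
```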

Hope this cleared your doubt. :smile:
