Mini-batch gradient descent

Does scikit-learn use gradient descent to train regression models?
And is mini-batch gradient descent better than what scikit-learn does?

Hey @aakritiaggarwal

It depends on the model. Plain LinearRegression uses a closed-form least-squares solution, while models such as SGDClassifier and SGDRegressor are trained with stochastic gradient descent; sklearn does not expose mini-batch gradient descent directly. The workaround is that many models in sklearn provide the partial_fit() method, which lets you train on self-generated batches.

from sklearn.linear_model import SGDClassifier
import numpy
import random

clf2 = SGDClassifier(loss='log_loss')  # 'log' in older sklearn versions; shuffle=True has no effect with partial_fit
shuffledRange = list(range(len(X)))  # range() must be converted to a list before shuffling
n_iter = 5
batch_size = 10000
classes = numpy.unique(Y)  # partial_fit needs the full set of class labels up front
for n in range(n_iter):
    random.shuffle(shuffledRange)
    shuffledX = [X[i] for i in shuffledRange]
    shuffledY = [Y[i] for i in shuffledRange]
    for start in range(0, len(shuffledX), batch_size):
        clf2.partial_fit(shuffledX[start:start + batch_size],
                         shuffledY[start:start + batch_size],
                         classes=classes)

Check out this example I found online.
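Since the original question was about regression, the same partial_fit idea can be sketched with SGDRegressor. Below is a minimal, self-contained example on synthetic data (the data, batch size, and epoch count are made up for illustration, not taken from the thread):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

# Hypothetical synthetic data: y is a noisy linear function of X
rng = np.random.RandomState(0)
X = rng.randn(1000, 3)
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.randn(1000)

reg = SGDRegressor(random_state=0)
batch_size = 100
n_epochs = 20
for epoch in range(n_epochs):
    perm = rng.permutation(len(X))  # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        reg.partial_fit(X[idx], y[idx])  # one gradient pass over this mini-batch

print(reg.coef_)  # should be close to [1.5, -2.0, 0.5]
```

Each call to partial_fit performs SGD updates only on the batch it is given, so looping over shuffled slices of the data is what turns it into mini-batch training.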

Happy Learning!


Hello @CrazyRabbit
Thank you for the very useful resource; I will definitely try this approach.


Happy to help @aakritiaggarwal !

I hope I've cleared your doubt. Please rate your experience here.
Your feedback is very important. It helps us improve our platform and provide you
the learning experience you deserve.

If you still have questions or don't find the answer satisfactory, you may reopen
the doubt.
