Effect of mini-batch on the accuracy of the model?

If I use mini-batch gradient descent, will my model be less accurate than one trained with (batch) gradient descent?

Hi Sagnik,
Mini-batch is an extension of gradient descent.
In batch gradient descent we iterate over the whole dataset and then backpropagate, while in mini-batch we process small batches at a time.
With mini-batch we don't have to keep the whole dataset in main memory; it can be loaded one batch at a time.
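To make the difference concrete, here is a minimal sketch, assuming a simple linear-regression loss and synthetic NumPy data (the names grad, lr, and batch_size are just for illustration):

```python
import numpy as np

def grad(w, X, y):
    # Gradient of the mean-squared-error loss for linear regression.
    return 2 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X @ np.ones(5) + 0.1 * rng.normal(size=1000)  # synthetic targets
lr, batch_size = 0.1, 32

# Batch gradient descent: ONE update per pass over the full dataset,
# and all of X and y must be in memory for every step.
w = np.zeros(5)
for epoch in range(10):
    w -= lr * grad(w, X, y)

# Mini-batch gradient descent: one update per small batch, so many
# updates per epoch, and only one batch is needed in memory at a time.
w = np.zeros(5)
for epoch in range(10):
    idx = rng.permutation(len(y))
    for start in range(0, len(y), batch_size):
        b = idx[start:start + batch_size]
        w -= lr * grad(w, X[b], y[b])
```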

- The model update frequency is higher than in batch gradient descent, which allows for more robust convergence and helps avoid local minima.
- The batched updates are computationally more efficient than stochastic gradient descent.
- The batching gives both the efficiency of not having all training data in memory and the efficiency of batched algorithm implementations.
"mini batch increases the accuracy of the model "

But the gradient computed in mini-batch gradient descent is only an approximation of the full-batch gradient, so how can it give better accuracy?
And how do I decide which algorithm to apply by looking at the data set? Should it be based on the size of the data set?

Sure, one update with a big mini-batch is "better" (in terms of gradient accuracy) than one update with a small mini-batch. Agreed.
The size of the mini-batches essentially sets the frequency of updates: smaller mini-batches mean more updates per epoch, while bigger mini-batches are better suited to efficient parallelization, as the snippet below shows.
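As a rough worked example of that trade-off (the dataset size here is made up):

```python
n_samples = 10_000
for batch_size in (10, 100, 1_000):
    print(batch_size, n_samples // batch_size)
# batch_size=10   -> 1000 updates per epoch (noisy, frequent)
# batch_size=100  ->  100 updates per epoch
# batch_size=1000 ->   10 updates per epoch (smoother, parallelizes well)
```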

Deciding on the algorithm depends on many factors, like your machine's limits and your model's inputs and outputs.
In batch gradient descent you have to load the whole dataset into memory at once; because of that memory limit we use mini-batch instead, but in exchange we sacrifice some gradient accuracy. A sketch of how to stream batches from disk follows.
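For the memory point, here is a minimal sketch, assuming the data is stored as .npy files on disk (the file names, and the grad/lr/w in the usage comment, are hypothetical):

```python
import numpy as np

def iter_batches(x_path, y_path, batch_size=32):
    # Memory-map the arrays: only the slices we actually touch are read
    # from disk, so the full dataset never has to fit in RAM.
    X = np.load(x_path, mmap_mode="r")
    y = np.load(y_path, mmap_mode="r")
    for start in range(0, len(y), batch_size):
        stop = start + batch_size
        yield np.asarray(X[start:stop]), np.asarray(y[start:stop])

# Usage: one gradient step per batch, one batch in memory at a time.
# for Xb, yb in iter_batches("X.npy", "y.npy"):
#     w -= lr * grad(w, Xb, yb)
```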