We have 10 epochs, so does batch_size mean that in every epoch we pass a number of images equal to batch_size?
What does batch_size represent?
Hey @ishabehera, no, let me explain this:
In every epoch the network actually sees ALL of the images, just batch_size of them at a time. One epoch is when the ENTIRE dataset is passed forward and backward through the neural network exactly ONCE. Since one epoch is usually too big to feed to the computer at once, we divide it into several smaller batches. The batch size is simply the total number of training examples present in a single batch.
Now you might also be confused by the term iterations, so let me explain that as well. The number of iterations is the number of batches needed to complete one epoch.
Note: for one epoch, the number of batches is therefore equal to the number of iterations.
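In symbols (my own shorthand, not from the course material): iterations per epoch = ceil(total training examples / batch size).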
Example: Let’s say we have 2000 training examples that we are going to use.
We can divide this dataset of 2000 examples into batches of 500; it will then take 4 iterations to complete 1 epoch (the batch size here is 500).
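If it helps to see this in code, here is a minimal sketch using PyTorch (my choice of framework for illustration; the course may use a different one, and the dummy image shape 3x32x32 is made up):

```python
import math
import torch
from torch.utils.data import TensorDataset, DataLoader

num_examples = 2000   # total training examples, as in the example above
batch_size = 500

# 2000 dummy "images" with dummy labels, just to have a dataset to batch.
images = torch.randn(num_examples, 3, 32, 32)
labels = torch.randint(0, 10, (num_examples,))
loader = DataLoader(TensorDataset(images, labels), batch_size=batch_size)

# Iterations per epoch = number of batches = ceil(N / batch_size).
print(math.ceil(num_examples / batch_size))  # 4
print(len(loader))                           # 4 batches per epoch

for epoch in range(10):                        # 10 epochs, as in your question
    for batch_images, batch_labels in loader:  # 4 iterations per epoch
        pass  # the forward and backward pass would go here
```

Note that len(loader) gives you the iterations per epoch directly, and if the batch size didn’t divide 2000 evenly, the last batch would simply be smaller.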
I hope this clears your doubt and explains the concept!
Happy Learning