What if the validation set is very large and cannot be loaded into memory?

What would the approach be in that case?
What I could think of was:

  1. Create an empty folder for each class (from `train_gen.class_indices.keys()`).
  2. Load the data in small batches and put each image into its appropriate class folder.
     Then use an ImageDataGenerator object to stream the data to the fit_generator function.

Kindly correct my approach if it is not the best one.

Hey @chiragwxN, yes, I would also do something like this.

Kudos, great job!
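A minimal sketch of that idea in plain Python, with the class names and file batch hard-coded as hypothetical stand-ins (in Keras they would come from `train_gen.class_indices.keys()` and from reading the validation files in chunks); the Keras calls at the end are shown only as comments:

```python
import os
import shutil
import tempfile

# Hypothetical class names; in real code use train_gen.class_indices.keys().
class_names = ["cat", "dog"]

root = tempfile.mkdtemp()
val_dir = os.path.join(root, "validation")

# Step 1: create one empty folder per class.
for name in class_names:
    os.makedirs(os.path.join(val_dir, name), exist_ok=True)

# Step 2: process (filename, label) pairs in small batches, moving each
# image into its class folder so the full set never sits in memory.
def sort_batch(batch, src_dir):
    for fname, label in batch:
        shutil.move(os.path.join(src_dir, fname),
                    os.path.join(val_dir, label, fname))

# Toy stand-in for a large validation set: empty files in a source folder.
src = os.path.join(root, "incoming")
os.makedirs(src)
for fname in ["1.jpg", "2.jpg", "3.jpg"]:
    open(os.path.join(src, fname), "w").close()

sort_batch([("1.jpg", "cat"), ("2.jpg", "dog"), ("3.jpg", "cat")], src)

# Afterwards a generator streams the images from disk, e.g. in Keras:
#   val_gen = ImageDataGenerator(rescale=1/255.).flow_from_directory(
#       val_dir, target_size=(224, 224), class_mode="categorical")
#   model.fit_generator(train_gen, validation_data=val_gen, ...)
print(sorted(os.listdir(os.path.join(val_dir, "cat"))))  # → ['1.jpg', '3.jpg']
```

Because `flow_from_directory` reads images lazily in batches, validation memory use stays bounded by the batch size rather than the dataset size.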

Hope this resolves your doubt.
Please mark the doubt as resolved in your doubts section. :blush:
