Sir, batches generated by ImageDataGenerator.flow_from_directory() do not fit in model.fit()

Sir, in my validation folder I have 400 images from 4 classes, and when I pass model.fit(…, …, validation_data=imagedatagenerator.flow_from_directory(…)), the error comes: "When passing validation_data, it must contain 2 (x_val, y_val) or 3 (x_val, y_val, val_sample_weights) items, however it contains 13 items"
Although I have tried changing the batch size from 32 to 256 and also 512 for val_data, the error does not go away.

here is the colab link:
https://colab.research.google.com/drive/1iEMeZrBzbHMNeWHvE08Dn7V4G_oqc-2a#scrollTo=bh5KA_FRgEsl

hey @abubakar_nsit ,
can you please provide access to your Colab link, as I am unable to access it.
You can also share its GitHub link with me, if there is one.

Thank You and Happy Learning :slightly_smiling_face:.

hey @abubakar_nsit ,
can you please upload your dataset to Drive or, preferably, GitHub and share its link with me, as without it I won't be able to run and test your code.

Thank You and Happy Learning :slightly_smiling_face:.

Sir, basically data.zip has two folders, val and train, and each of them contains 4 folders ('frog', 'fish', 'boat', 'spider'). The train folder contains 400×4 images, whereas val contains 100×4 images.
Sir, I have uploaded the data to the Colab notebook as well, but the data might be lost when the notebook refreshes.

Here is the GitHub link for the same, sir:

hey @abubakar_nsit ,
when you are fitting the model, either pass both training and validation data as NumPy arrays or both as generators; mixing the two sometimes leads to confusion and hence an error is raised.
So I would suggest you kindly change that.
To help you a bit, I have implemented it here.

Kindly go through this once. I haven't tuned it, hence it gives such poor results, but you can understand how the generators work.
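For reference, here is a minimal sketch of the "both as generators" approach. It assumes your archive extracts to data/train and data/val, each with one sub-folder per class ('frog', 'fish', 'boat', 'spider'); the dummy-image setup at the top is only there so the snippet runs standalone, and the tiny model is a placeholder, not your actual architecture.

```python
import os
import numpy as np
from PIL import Image
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Build a tiny dummy dataset so the sketch runs end to end;
# replace this block with your real data/train and data/val folders.
classes = ['frog', 'fish', 'boat', 'spider']
for split, n in [('data/train', 8), ('data/val', 2)]:
    for c in classes:
        d = os.path.join(split, c)
        os.makedirs(d, exist_ok=True)
        for i in range(n):
            Image.fromarray(
                np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
            ).save(os.path.join(d, f'{i}.png'))

# One generator per split: both train and validation come from generators.
train_gen = ImageDataGenerator(rescale=1. / 255).flow_from_directory(
    'data/train', target_size=(64, 64), batch_size=32,
    class_mode='categorical')
val_gen = ImageDataGenerator(rescale=1. / 255).flow_from_directory(
    'data/val', target_size=(64, 64), batch_size=32,
    class_mode='categorical')

# Placeholder model, just to show the fit() call shape.
model = models.Sequential([
    layers.Conv2D(16, 3, activation='relu', input_shape=(64, 64, 3)),
    layers.GlobalAveragePooling2D(),
    layers.Dense(len(classes), activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# Since validation_data is itself a generator, fit() no longer tries to
# unpack it as an (x_val, y_val) tuple, so the "contains 13 items"
# error does not appear.
history = model.fit(train_gen, validation_data=val_gen,
                    epochs=1, verbose=0)
```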

I hope this resolves your doubt.
Thank You and Happy Coding :slightly_smiling_face:.

Sir, basically I was trying to append 5 augmented images for each image in the train folder (i.e. making the train data 1600X4X5), then train our model on this data and validate it with the original val folder, which contains (400X4) images. Sir, how can I achieve this?

So basically you want to create 5 augmentations of each image. Is that correct?

Yes sir, and train the model on the total train data (which would become (5+1)X images).

So, to do that there are 2 ways:

  1. Either run generator.flow over every image, create 5 augmentations, and store those images in another folder.
  2. Or just change your generator's batch_size to num_classes × the number of augmentations you need.
    If you also want to store them in real time (while your model is training and creating augmentations), then add the save_to_dir parameter to flow_from_directory to save the generated images in a specified directory.
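A minimal sketch of option 1, assuming your source images live in data/train/<class>/ and the augmented copies should land in data/augmented/<class>/. The dummy source image is only there so the snippet runs standalone; the folder names and the transform settings (rotation, flip, zoom) are illustrative choices, not requirements.

```python
import os
import numpy as np
from PIL import Image
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# The augmentations applied to each image; tune these to your data.
augmenter = ImageDataGenerator(rotation_range=20, horizontal_flip=True,
                               zoom_range=0.1)

src_root, dst_root = 'data/train', 'data/augmented'
n_augmentations = 5

# Dummy source image so the sketch runs; replace with your real folders.
os.makedirs(os.path.join(src_root, 'frog'), exist_ok=True)
Image.fromarray(
    np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
).save(os.path.join(src_root, 'frog', '0.png'))

for cls in os.listdir(src_root):
    dst_dir = os.path.join(dst_root, cls)
    os.makedirs(dst_dir, exist_ok=True)
    for fname in os.listdir(os.path.join(src_root, cls)):
        img = np.asarray(Image.open(os.path.join(src_root, cls, fname)))
        batch = img[np.newaxis]  # flow() expects a 4-D batch
        # save_to_dir writes each augmented image to disk as it is drawn.
        flow = augmenter.flow(batch, batch_size=1,
                              save_to_dir=dst_dir, save_prefix=cls,
                              save_format='png')
        for _ in range(n_augmentations):  # draw 5 augmented copies
            next(flow)
```

After this runs, data/augmented holds the 5 transformed copies of every image; you can then point a plain flow_from_directory at a folder combining the originals and the augmented copies.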

I hope I've cleared your doubt. I ask you to please rate your experience here.
Your feedback is very important. It helps us improve our platform and hence provide you
the learning experience you deserve.

If you still have some questions or do not find the answers satisfactory, you may reopen
the doubt.