Memory Error in Data Augmentation Code


```
MemoryError                               Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 X_val, Y_val = load_validation_data((224,224), 200)

<ipython-input> in load_validation_data(target_size, no_of_classes)
      4
      5     m = len(lines)
----> 6     X = np.empty((m, *target_size, 3))
      7     Y = np.empty(m)
      8

MemoryError: Unable to allocate 11.2 GiB for an array with shape (10000, 224, 224, 3) and data type float64
```

Here is my code:
https://colab.research.google.com/drive/1MPmDL2mpQOkeQbiNYD5pTVkjCDZmPOih?usp=sharing

Hey @sanchit123manchanda, NumPy allocates arrays as float64 by default, so 10000 × 224 × 224 × 3 values at 8 bytes each is exactly the 11.2 GiB in the error. Try loading fewer images at a time (or let ImageDataGenerator stream batches from disk instead of building one giant array), pass a smaller dtype such as float32 to np.empty, or switch your Colab session to the high-RAM (~24 GB) runtime that Colab offers after a crash.
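
For example, here is a minimal sketch of the dtype fix. It mirrors the function in your traceback, but the signature and the loading loop are my assumptions (explicit path/label lists, Keras image utilities) so the example is self-contained:

```python
import numpy as np
from tensorflow.keras.preprocessing.image import load_img, img_to_array

def load_validation_data(image_paths, labels, target_size=(224, 224)):
    m = len(image_paths)
    # float32 halves the memory of NumPy's float64 default
    # (~5.6 GiB instead of 11.2 GiB for 10000 images of 224x224x3);
    # uint8 would shrink it to ~1.4 GiB if you rescale later instead
    X = np.empty((m, *target_size, 3), dtype=np.float32)
    Y = np.asarray(labels, dtype=np.int32)
    for i, path in enumerate(image_paths):
        img = load_img(path, target_size=target_size)  # resized PIL image
        X[i] = img_to_array(img) / 255.0               # pixels scaled to [0, 1]
    return X, Y
```

Better still, ImageDataGenerator.flow_from_directory yields batches straight from disk, so the full 10000-image array never has to exist in RAM at once.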

I hope this helps! :slight_smile:
Please mark the doubt as resolved in your doubts section! :+1:
Happy Learning! :slight_smile:

I was doing this in a Jupyter notebook, as I am not able to understand how I can use the complete tiny image folder in Google Colab. Is there any way to do that?

Yes, absolutely. You can upload the dataset to your Google Drive and access it from your Colab notebook by mounting the drive. Upload the dataset to Google Drive first, then search for "how to mount your Colab notebook on your drive".
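
For reference, a snippet for the mounting step; the dataset path at the end is just a hypothetical placeholder for wherever you upload the folder on Drive:

```python
from google.colab import drive

# Prompts you to authorize access, then exposes your Drive under /content/drive
drive.mount('/content/drive')

# Hypothetical path: change it to wherever you uploaded the dataset
DATASET_DIR = '/content/drive/MyDrive/tiny-imagenet'
```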

If you face any issue, feel free to ask me :+1:
Happy Learning! :slightly_smiling_face:

I hope I've cleared your doubt. I ask you to please rate your experience here.
Your feedback is very important. It helps us improve our platform and hence provide you the learning experience you deserve.

On the off chance you still have some questions or do not find the answers satisfactory, you may reopen the doubt.