MemoryError in CountVectorizer

When I use CountVectorizer on the training set, after cleaning the text with the already provided clean_text.py file, I get a MemoryError.
How should I resolve this?

hi @Gopesh-Khandelwal-1098077763666198

Please remove stop words and lemmatize the words: this will reduce the number of words in the vocabulary.
Also remember that you are count-vectorizing the sentences, not the individual words.
If the problem still persists, try running the code on Colab, because this error is caused by a lack of memory.
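As a minimal sketch of the first suggestion (the corpus below is made up, and `max_features=1000` is an illustrative cap, not a value from this thread), `CountVectorizer` can drop stop words and limit the vocabulary size itself, which directly shrinks the matrix it builds:

```python
from sklearn.feature_extraction.text import CountVectorizer

# Toy corpus standing in for the cleaned training text.
corpus = [
    "the quick brown fox jumps over the lazy dog",
    "the dog barked at the quick fox",
    "a lazy dog sleeps all day",
]

# Drop English stop words and cap the vocabulary at 1000 terms
# (both reduce the number of columns, and hence memory use).
vectorizer = CountVectorizer(stop_words="english", max_features=1000)
X = vectorizer.fit_transform(corpus)  # sparse matrix: docs x vocabulary

print(X.shape)                        # (3, number_of_kept_terms)
print(sorted(vectorizer.vocabulary_))  # stop words like "the" are gone
```

Lemmatization would be done in the cleaning step before vectorizing (e.g. with NLTK's WordNetLemmatizer); it is omitted here to keep the sketch self-contained.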

This happened to me as well; it worked once I ran it on a machine with 16 GB of RAM. One Stack Overflow answer said it is because you are storing so much data in a single variable that your memory cannot handle it.
I'm not entirely sure whether that explanation is correct.

Also, I tried this: do not call .toarray() on the CountVectorizer output; leave it as a sparse matrix. This also worked for me.
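To illustrate that last point with a minimal sketch (the tiny corpus and labels are made up): `fit_transform` already returns a SciPy sparse matrix, and scikit-learn estimators such as `MultinomialNB` accept it directly, so there is usually no need to materialize the dense array that triggers the MemoryError:

```python
from scipy.sparse import issparse
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

corpus = ["good movie", "bad movie", "great film", "terrible film"]
labels = [1, 0, 1, 0]

X = CountVectorizer().fit_transform(corpus)
print(issparse(X))  # True: stays sparse, no .toarray() needed

# X.toarray() would allocate a dense n_docs x vocab_size array,
# which is what exhausts memory on a large corpus.

# Estimators accept the sparse matrix as-is:
clf = MultinomialNB().fit(X, labels)
print(clf.predict(X))
```

A dense matrix stores every zero explicitly, so for a large corpus it needs roughly n_documents x vocabulary_size cells; the sparse representation only stores the nonzero counts.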