MemoryError while using CountVectorizer

The following code raises an error:

cv=CountVectorizer()
xcv=cv.fit_transform(xc).toarray()

error:

MemoryError                               Traceback (most recent call last)
in
      1 cv=CountVectorizer()
----> 2 xcv=cv.fit_transform(xc).toarray()

~\Anaconda3\lib\site-packages\scipy\sparse\compressed.py in toarray(self, order, out)
   1022         if out is None and order is None:
   1023             order = self._swap('cf')[0]
-> 1024         out = self._process_toarray_args(order, out)
   1025         if not (out.flags.c_contiguous or out.flags.f_contiguous):
   1026             raise ValueError('Output array must be C or F contiguous')

~\Anaconda3\lib\site-packages\scipy\sparse\base.py in _process_toarray_args(self, order, out)
   1184             return out
   1185         else:
-> 1186             return np.zeros(self.shape, dtype=self.dtype, order=order)
   1187
   1188

MemoryError:

Hey @dibakarchaudhary58, this is a RAM overflow: calling `.toarray()` converts the sparse matrix returned by `fit_transform` into a dense NumPy array of shape (n_documents, vocabulary_size), which needs more memory than your machine has. One option is switching to Google Colab, which provides substantially more RAM. Better yet, skip the conversion entirely and keep the result as a sparse matrix — most scikit-learn estimators accept sparse input directly.
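A minimal sketch of the sparse approach (the `docs` list here is a placeholder standing in for your `xc`):

```python
from sklearn.feature_extraction.text import CountVectorizer

# Placeholder corpus; substitute your own list of documents (xc).
docs = ["the cat sat", "the dog barked", "the cat and the dog"]

cv = CountVectorizer()
xcv = cv.fit_transform(docs)   # scipy.sparse matrix -- note: no .toarray()

# Sparse matrices can be passed straight to most scikit-learn estimators,
# e.g. model.fit(xcv, y), so no dense conversion is needed.

# If you truly need a dense array, cap the vocabulary size first so the
# dense matrix stays small (1000 is an arbitrary example value):
cv_small = CountVectorizer(max_features=1000)
xcv_small = cv_small.fit_transform(docs).toarray()
```

The key point is that the sparse matrix stores only the nonzero counts, while `.toarray()` allocates one cell for every (document, vocabulary word) pair, which is what triggered the `np.zeros` call in your traceback.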


I faced the same problem in Colab while using fit_transform() in a machine learning task. Can anyone help me?