Memory Error on Training Dataset

I am getting a MemoryError in my notebook when I try to fit the CountVectorizer on the training dataset.

Hi @S18CRX0180,
please share the code with me.
Most probably you are converting the vectorized output to a dense NumPy array at the same time, which is what exhausts the memory.
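
If that is the case: fit_transform returns a SciPy sparse matrix that only stores the non-zero counts, while toarray() materializes the full documents x vocabulary grid. A rough sketch of the size difference, using a toy corpus (illustrative only, not your code):

from sklearn.feature_extraction.text import CountVectorizer

corpus = ["the cat sat on the mat", "the dog ate my homework"]  # toy stand-in
cv = CountVectorizer()
x_vec = cv.fit_transform(corpus)      # SciPy CSR matrix: stores non-zeros only

n_docs, vocab_size = x_vec.shape
print(x_vec.data.nbytes)              # bytes used by the sparse non-zero counts
print(n_docs * vocab_size * 8)        # bytes a dense int64 array would need

On a realistic corpus (say 100,000 documents with a 50,000-word vocabulary) the dense array alone would need roughly 40 GB, which is why the notebook crashes.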

This is the code:

x_vec = cv.fit_transform(x_clean)  # returns a SciPy sparse matrix
x_arr = x_vec.toarray()            # densifying it here raises the MemoryError

Hi @S18CRX0180,
try it on Colab or on any PC with more RAM.
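
Also, if more RAM is not an option, you can usually drop the toarray() call entirely, since most scikit-learn estimators accept the sparse matrix directly. A minimal sketch, assuming a classification task (the corpus, labels, and MultinomialNB below are placeholders, not your actual pipeline):

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

corpus = ["free money now", "meeting at noon", "win a big prize", "lunch tomorrow"]
y = [1, 0, 1, 0]                           # toy spam/ham labels

cv = CountVectorizer(max_features=50_000)  # optionally cap the vocabulary size too
x_vec = cv.fit_transform(corpus)

clf = MultinomialNB()
clf.fit(x_vec, y)                          # trains directly on the sparse matrix
print(clf.predict(cv.transform(["free prize"])))

max_features keeps only the most frequent terms, which also cuts memory if something downstream really does require a dense array.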