Why is every word a 50-dimension vector? What does each dimension store? Please explain ASAP.
GloVe Embedding
Hey @Management718,
GloVe is a word vector technique. Just to refresh: word vectors map words into a vector space where similar words cluster together and dissimilar words lie far apart. The advantage of GloVe is that, unlike Word2vec, it does not rely only on local statistics (the local context window of each word) but also incorporates global statistics (the corpus-wide word co-occurrence matrix) to obtain word vectors.
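To make "similar words cluster together" concrete, here is a toy sketch using cosine similarity. The vectors below are made up for illustration (they are not real GloVe vectors); the point is only that related words end up with similar vectors, so their cosine similarity is high:

```python
import numpy as np

# Hypothetical 3-d vectors, NOT real GloVe embeddings:
vectors = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0.0 means orthogonal.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["king"], vectors["queen"]))  # high (related words)
print(cosine(vectors["king"], vectors["apple"]))  # lower (unrelated words)
```

With real GloVe vectors you would see the same pattern, just in 50 (or more) dimensions.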
In the original paper, the authors trained vectors with 25, 50, 100, 200, and 300 dimensions. The individual dimensions are not interpretable: after training, you get a vector of dimension d that captures many properties of the word as a whole, not one named property per dimension. Increasing the dimension lets the vector capture more information, but the computational cost also increases.
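Pretrained GloVe files are plain text, one word per line followed by its d numbers, so the dimension is simply the number of values after the word. A minimal loading sketch (the two-line sample below is invented; with the real `glove.6B.50d.txt` each line would carry 50 values):

```python
import numpy as np

# Invented sample in GloVe's text format: "word v1 v2 ... vd" per line.
sample = """the 0.1 0.2 0.3 0.4
king 0.5 0.6 0.7 0.8
"""

def load_glove(lines):
    # Parse GloVe-format lines into {word: vector}.
    embeddings = {}
    for line in lines:
        parts = line.rstrip().split(" ")
        embeddings[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return embeddings

emb = load_glove(sample.splitlines())
print(len(emb["king"]))  # d = 4 in this toy sample; 50 for the 50-d GloVe file
```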