Word Embeddings

How do word embeddings get trained automatically when we do any NLP task?
I want to know what happens behind the scenes.

Basically, when you use the Embedding layer from Keras,
layers.Embedding(vocab_size, embedding_dim),
it is more or less a small neural network layer itself: each of the vocab_size tokens (which can be very large, e.g. 10k) is represented by a dense vector of only embedding_dim values, and those vectors are trainable weights that get updated by backpropagation along with the rest of the model while it trains on the task's loss. Think of it like feature extraction from images, where we learn to keep only the important features. The same thing is going on here.
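Here is a minimal sketch of that idea (the sizes, dummy data, and model layers are illustrative, not from the original question): the Embedding layer is just a trainable (vocab_size x embedding_dim) weight matrix, and fitting the model updates it like any other layer.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

vocab_size = 10_000   # e.g. 10k distinct tokens (assumed for this example)
embedding_dim = 16    # each token is compressed to a 16-dim vector (assumed)
max_len = 20          # padded sequence length (assumed)

embedding = layers.Embedding(vocab_size, embedding_dim)
model = tf.keras.Sequential([
    embedding,                              # integer token ids -> dense vectors
    layers.GlobalAveragePooling1D(),        # average the word vectors in a sequence
    layers.Dense(1, activation="sigmoid"),  # e.g. a binary classification head
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Dummy data: 32 "sentences" of random token ids with random 0/1 labels,
# only to show that training changes the embedding weights.
x = np.random.randint(0, vocab_size, size=(32, max_len))
y = np.random.randint(0, 2, size=(32, 1))

model(x[:1])                                # build the model so weights exist
before = embedding.get_weights()[0].copy()  # embedding matrix before training
model.fit(x, y, epochs=1, verbose=0)
after = embedding.get_weights()[0]          # embedding matrix after one epoch
print("embedding weights updated by training:", not np.allclose(before, after))
```

So the embeddings are not trained by any separate procedure; they are learned as a side effect of minimizing the loss of whatever task the model is trained on.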

Hope this helped :blush:
