Embedding in word2vec

What is an embedding, and what is a vector embedding?

Hi,
A word embedding is a type of word representation in which words with similar meanings get similar representations, making them usable by machine learning algorithms.
It is a mapping of words to vectors of real numbers, typically learned with a neural network, a probabilistic model, or dimensionality reduction applied to a word co-occurrence matrix (word2vec uses a shallow neural network).
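To make the "words to vectors" mapping concrete, here is a minimal sketch in NumPy. The vectors below are random, not trained word2vec vectors; in real word2vec they are learned so that similar words end up close together, which you would then measure with cosine similarity as shown.

```python
import numpy as np

# Toy vocabulary: each word is assigned an index into an embedding matrix.
vocab = {"king": 0, "queen": 1, "apple": 2}

embedding_dim = 4
rng = np.random.default_rng(42)
# In word2vec these rows are learned by a shallow neural network;
# here they are random, only to illustrate the word -> vector mapping.
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word):
    """Map a word to its vector of real numbers."""
    return embeddings[vocab[word]]

def cosine_similarity(u, v):
    """After training, similar words have high cosine similarity."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))        # a length-4 vector of real numbers
print(cosine_similarity(embed("king"), embed("queen")))
```

With trained vectors (e.g. from gensim's `Word2Vec`), the same lookup and similarity computation is what powers "similar words" queries.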