Reference: Word2Vec and FastText Word Embedding with Gensim
Reference: An Intuitive Understanding of Word Embeddings: From Count Vectors to Word2Vec
Reference: What the heck is Word Embedding
Word Embeddings are numerical representations of texts.
Word Embeddings are dense representations of the individual words in a text, taking into account the context and the other surrounding words that each individual word occurs with.
To summarize, a word embedding is a dense vector representation of text. The Chinese rendering of the term (词嵌入) can be a little hard to grasp at first, but it simply means representing text with numbers.
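To make "dense vector representation" concrete, here is a minimal sketch using the Gensim library mentioned in the references above. It assumes gensim 4.x is installed; the toy corpus and hyperparameter values are illustrative only, not taken from the referenced articles.

```python
# A minimal sketch, assuming gensim 4.x is installed (pip install gensim).
# The toy corpus and hyperparameters below are illustrative only.
from gensim.models import Word2Vec

# Each "sentence" is a list of tokens; a real corpus would be much larger.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["man", "walks", "in", "the", "city"],
    ["woman", "walks", "in", "the", "city"],
]

# Train a small Word2Vec model: each word is mapped to a dense 50-dimensional vector.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the dense embedding
    window=2,         # context window: how many surrounding words are considered
    min_count=1,      # keep every word, even if it appears only once
    epochs=100,
)

# The embedding of a word is a dense numeric vector, not a sparse count vector.
print(model.wv["king"].shape)   # (50,)
print(model.wv["king"][:5])     # first few numbers of the dense vector

# On a real corpus, words that occur in similar contexts end up with similar vectors.
print(model.wv.similarity("king", "queen"))
```

This is exactly the idea in the quotes above: each word gets a fixed-length list of numbers, and those numbers are learned from the surrounding context in which the word appears.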