10 Best Links: Word Embeddings

posted Nov 2, 2018, 8:32 AM by MUHAMMAD MUN`IM AHMAD ZABIDI   [ updated Dec 3, 2018, 8:55 PM ]
A popular idea in modern machine learning is to represent words as vectors. These vectors capture hidden information about a language, such as word analogies and semantic relationships.
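The analogy idea from [Mikolov 2013a] can be sketched with a toy example: with suitable vectors, "king" − "man" + "woman" lands near "queen" under cosine similarity. The 3-dimensional embeddings below are illustrative values chosen by hand, not output from a trained model.

```python
import numpy as np

# Toy 3-dimensional embeddings (illustrative, hand-picked values,
# not taken from any trained word2vec model).
emb = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "queen": np.array([0.8, 0.1, 0.9]),
    "man":   np.array([0.2, 0.9, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The classic analogy: king - man + woman should be closest to queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max((w for w in emb if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, emb[w]))
print(best)  # queen
```

With real pretrained embeddings (e.g. loaded via a library such as gensim), the same arithmetic recovers many such analogies, though not perfectly.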
  1. An introduction to word embeddings
  2. Introduction to Word Embeddings: Problems and Theory
  3. [Hamilton 2016] Hamilton, William L., et al. “Inducing domain-specific sentiment lexicons from unlabeled corpora.” arXiv preprint arXiv:1606.02820 (2016).
  4. [Kusner 2015] Kusner, Matt, et al. “From word embeddings to document distances.” International Conference on Machine Learning. 2015.
  5. [Mikolov 2013a] Mikolov, Tomas, Wen-tau Yih, and Geoffrey Zweig. “Linguistic regularities in continuous space word representations.” HLT-NAACL. Vol. 13. 2013.
  6. [Mikolov 2013b] Mikolov, Tomas, et al. “Efficient estimation of word representations in vector space.” arXiv preprint arXiv:1301.3781 (2013).
  7. [Mikolov 2013c] Mikolov, Tomas, et al. “Distributed representations of words and phrases and their compositionality.” Advances in Neural Information Processing Systems. 2013.
  8. [Mikolov 2013d] Mikolov, Tomas, Quoc V. Le, and Ilya Sutskever. “Exploiting similarities among languages for machine translation.” arXiv preprint arXiv:1309.4168 (2013).