One of the key ideas in NLP is how to efficiently convert words into numeric vectors, which can then be fed into various machine learning models to make predictions. The most widely used technique for this at present is called "Word2Vec".

Andy Thomas on Word2Vec word embedding in Python and TensorFlow
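To make the idea concrete, here is a minimal sketch of the word-to-vector lookup that underlies word embeddings. The vocabulary, embedding dimension, and the random embedding matrix are all illustrative placeholders; a trained Word2Vec model would supply learned values instead.

```python
import numpy as np

# Toy vocabulary; in practice this is built from the training corpus.
vocab = ["the", "quick", "brown", "fox"]
word_to_index = {word: i for i, word in enumerate(vocab)}

embedding_dim = 5  # real Word2Vec models typically use 100-300 dimensions

# One row per word. Word2Vec training would learn these values;
# here they are random placeholders for illustration only.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def word_vector(word):
    """Look up the numeric vector for a word."""
    return embeddings[word_to_index[word]]

vec = word_vector("fox")
print(vec.shape)  # (5,)
```

Once each word maps to a fixed-length vector like this, any downstream model that expects numeric input can consume text.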