Post date: Nov 08, 2017 8:01:28 AM
K Hong wrote a nice blog here. The Jupyter notebook is here.
One of the key ideas in NLP is how to efficiently convert words into numeric vectors that can then be "fed into" various machine learning models to make predictions. The current standard technique for this is called "Word2Vec". Andy Thomas on Word2Vec word embedding in Python and TensorFlow.
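As a rough illustration of the first step of that pipeline (this is a minimal sketch, not code from the linked tutorial): Word2Vec's skip-gram variant starts by mapping words to integer IDs and generating (target, context) pairs from a sliding window, which are then used to train the embedding vectors.

```python
def build_vocab(tokens):
    # Assign each unique word an integer ID in order of first appearance.
    vocab = {}
    for tok in tokens:
        if tok not in vocab:
            vocab[tok] = len(vocab)
    return vocab

def skip_gram_pairs(tokens, vocab, window=1):
    # For each word, pair its ID with the IDs of its neighbors
    # within `window` positions on either side.
    ids = [vocab[t] for t in tokens]
    pairs = []
    for i, target in enumerate(ids):
        for j in range(max(0, i - window), min(len(ids), i + window + 1)):
            if j != i:
                pairs.append((target, ids[j]))
    return pairs

tokens = "the quick brown fox jumps over the lazy dog".split()
vocab = build_vocab(tokens)
pairs = skip_gram_pairs(tokens, vocab, window=1)
```

A model such as the one in the tutorial then learns a dense vector per word ID, so that words appearing in similar contexts end up with similar vectors.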
The most popular method for performing classification and other analysis on sequences of data is the recurrent neural network. Andy Thomas on Recurrent neural networks and LSTM tutorial in Python and TensorFlow.
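The core idea behind recurrent networks can be sketched in a few lines (scalar weights here for clarity; this is an illustrative toy, not the tutorial's TensorFlow code): the hidden state at each step is a function of the current input and the previous hidden state, which is what lets the model carry information along a sequence. LSTMs extend this cell with gates so that long-range information survives training.

```python
import math

def rnn_forward(xs, w_x=0.5, w_h=0.8, b=0.0):
    # Vanilla RNN recurrence: h_t = tanh(w_x * x_t + w_h * h_{t-1} + b).
    # Real layers use weight matrices; scalars keep the recurrence visible.
    h = 0.0
    states = []
    for x in xs:
        h = math.tanh(w_x * x + w_h * h + b)
        states.append(h)
    return states

# An input at step 0 keeps influencing later hidden states through w_h.
states = rnn_forward([1.0, 0.0, 0.0])
```

With `w_h` well below 1, the influence of early inputs decays at each step, which is exactly the vanishing-signal problem that motivates the LSTM cell covered in the tutorial.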