Neural networks and deep learning have become the state of the art in several domains of machine learning, and natural language processing (NLP) is no exception. In particular, NLP has benefited from recurrent neural networks, which are well suited to the sequential nature of language data. Well-known applications include the voice assistants Siri, Google Now, and Cortana.
At data2day 2016 in Karlsruhe, Daniel Kirsch gave a fascinating talk addressing the following questions:
- How do recurrent neural networks (RNNs) work?
- For what kind of tasks are they helpful?
- How do you train and apply them?
A basic understanding of artificial neural networks is helpful for following the talk.
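As a small taste of the first question, the core idea of an RNN is a state vector that is updated at every step of the sequence. A minimal sketch in NumPy (all sizes, weights, and names here are illustrative, not from the talk):

```python
import numpy as np

# Illustrative dimensions: 4-dimensional inputs, 3-dimensional hidden state.
input_size, hidden_size = 4, 3
rng = np.random.default_rng(0)

# Randomly initialized weights: input-to-hidden, hidden-to-hidden, and a bias.
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_step(x, h):
    """One recurrent step: the new state depends on the current input
    and on the previous state -- this recurrence is what lets the
    network carry information along the sequence."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# Process a sequence of 5 random input vectors, carrying the state forward.
h = np.zeros(hidden_size)
for x in rng.normal(size=(5, input_size)):
    h = rnn_step(x, h)

print(h.shape)  # (3,)
```

In a real application the inputs would be word embeddings and the weights would be learned by backpropagation through time; this sketch only shows the forward recurrence.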