Master's Thesis: Recurrent Neural Networks for Natural Language Processing (Glavaš, Ponzetto)

This thesis should provide an in-depth overview of the various recurrent neural network models (fully recurrent networks, recursive networks, long short-term memory networks, etc.) and their variants (bidirectionality, attention-based extensions) that are used in different natural language processing tasks. The thesis should analyse in detail all relevant models (emphasizing the advantages and shortcomings of each) and the NLP tasks in which these models have been successfully applied, i.e., tasks in which they achieve state-of-the-art performance. The candidate is also expected to implement one of the state-of-the-art RNN models and apply it to one particular NLP task.
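For orientation, the fully recurrent (Elman-style) model mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration of the recurrent hidden-state update, not a reference implementation; all names, dimensions, and the tanh nonlinearity are illustrative choices.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # One step of a simple recurrent cell: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

def rnn_forward(xs, h0, W_xh, W_hh, b_h):
    # Run the cell over an input sequence, collecting all hidden states.
    h = h0
    states = []
    for x_t in xs:
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
        states.append(h)
    return states

# Toy run: a 3-step sequence of 4-dimensional inputs, 5-dimensional hidden state.
rng = np.random.default_rng(0)
xs = [rng.standard_normal(4) for _ in range(3)]
W_xh = 0.1 * rng.standard_normal((5, 4))
W_hh = 0.1 * rng.standard_normal((5, 5))
b_h = np.zeros(5)
states = rnn_forward(xs, np.zeros(5), W_xh, W_hh, b_h)
print(len(states), states[-1].shape)
```

LSTM and attention-based variants replace this single update with gated or weighted combinations of past states, but the sequential hidden-state recurrence shown here is the common core of all the models the thesis surveys.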