Natural Language Processing with Deep Learning


Applications that make use of Natural Language Processing (NLP) algorithms have increased over the last decade. With the rapid growth of artificial intelligence assistants and the tendency of companies to enhance their services with more interactive human-machine interactions, it is essential to understand how NLP techniques can be used to manipulate, analyze, and create text-based data. Modern techniques can capture the tone, context, and nuance of language much as humans do, and when properly designed, developers can exploit them to build powerful NLP applications that provide natural and seamless human-computer interactions through chatbots, intelligent software agents, and much more.

Deep Learning models have gained widespread acceptance for NLP tasks due to their ability to generalize accurately across a range of contexts and languages. Transformer-based models, such as the Bidirectional Encoder Representations from Transformers (BERT) model, have revolutionized NLP by achieving accuracy comparable to human baselines on benchmarks such as the SQuAD dataset for question answering, as well as in named entity recognition, intent recognition, sentiment analysis, and more.

In this course, students will become familiar with language processing techniques and word embeddings, and will learn to apply, train, and fine-tune their own deep neural models. The notes, as well as the proposed exercises, are based on teaching material and suggestions developed for this purpose by the NVIDIA Deep Learning Institute.

  • Introduction (NLP Applications)
  • Text preprocessing, BOW, TF-IDF
  • Dimensionality reduction for representation learning
  • Word Vectors (embeddings)
  • Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM)
  • Attention mechanism (global, local, self, multi-head)
  • Language Models (ELMo, ULMFiT), Transformers (BERT, RoBERTa, ELECTRA, etc.), Autoregressive Models (XLNet, GPT-2, etc.)
  • Word Embeddings (CBOW, Skip-gram)
  • Text Classification
  • Named Entity Recognition
  • Neural Machine Translation
  • Text Generation
  • Semantic Textual Similarity
  • Fact Verification
  • Question/Answering for Chatbots


Course Features

Course type: Minor

Semester: 2nd


Duration: 13 weeks

Courses: Instructor-led + online

Language: Greek with English notes

Assessment: Project based


