About this Course
365 ratings
84 reviews

100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Reset deadlines in accordance with your schedule.

Advanced level

Approx. 34 hours to complete

Recommended: 5 weeks of study, 4-5 hours per week...


Subtitles: English

Skills you will gain

Chatterbot, Tensorflow, Deep Learning, Natural Language Processing


Syllabus - What you will learn from this course

5 hours to complete

Intro and text classification

This module has two parts: first, a broad overview of the NLP area and our course goals, and second, a text classification task. It is probably the most common task you would deal with in real life: news feed classification, sentiment analysis, spam filtering, etc. You will learn how to go from raw texts to predicted classes both with traditional methods (e.g. linear classifiers) and deep learning techniques (e.g. Convolutional Neural Nets)....
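As a minimal illustration of the raw-text-to-class pipeline described above (a toy sketch, not part of the course materials; the featurizer, perceptron and data are all made up), a bag-of-words linear classifier can be written in plain Python:

```python
from collections import Counter

def featurize(text):
    # bag-of-words: token -> count
    return Counter(text.lower().split())

def train_perceptron(data, epochs=10):
    # data: list of (text, label) pairs with label in {+1, -1}
    w = Counter()  # sparse weight vector over tokens
    for _ in range(epochs):
        for text, label in data:
            feats = featurize(text)
            score = sum(w[t] * c for t, c in feats.items())
            if label * score <= 0:  # misclassified: nudge weights toward the label
                for t, c in feats.items():
                    w[t] += label * c
    return w

def predict(w, text):
    score = sum(w[t] * c for t, c in featurize(text).items())
    return 1 if score > 0 else -1

# made-up toy sentiment data
train = [("great movie loved it", 1), ("terrible boring film", -1),
         ("loved the acting great fun", 1), ("boring and terrible plot", -1)]
w = train_perceptron(train)
```

The course covers much stronger variants of the same idea (TF-IDF features, regularized linear models, and then convolutional nets), but the shape of the pipeline — featurize, score, update — stays the same.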
11 videos (Total 114 min), 3 readings, 3 quizzes
11 videos
Welcome video (5m)
Main approaches in NLP (7m)
Brief overview of the next weeks (7m)
[Optional] Linguistic knowledge in NLP (10m)
Text preprocessing (14m)
Feature extraction from text (14m)
Linear models for sentiment analysis (10m)
Hashing trick in spam filtering (17m)
Neural networks for words (14m)
Neural networks for characters (8m)
3 readings
Prerequisites check-list (2m)
Hardware for the course (5m)
Getting started with practical assignments (20m)
2 practice exercises
Classical text mining (10m)
Simple neural networks for text (10m)
5 hours to complete

Language modeling and sequence tagging

In this module we will treat texts as sequences of words. You will learn how to predict the next word given some previous words. This task is called language modeling, and it is used for query suggestion in search, machine translation, chat-bots, etc. You will also learn how to predict a sequence of tags for a sequence of words, which can be used to determine part-of-speech tags, named entities or any other tags, e.g. ORIG and DEST in the query "flights from Moscow to Zurich". We will cover methods based on probabilistic graphical models and deep learning....
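To make the language-modeling and perplexity ideas concrete, here is a toy bigram model with add-one smoothing in plain Python (an illustrative sketch only, not the course's assignment code; corpus and markers are made up):

```python
import math
from collections import Counter

def train_bigram_lm(sentences):
    # collect unigram/bigram counts with sentence boundary markers
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for sent in sentences:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        vocab.update(tokens)
        for a, b in zip(tokens, tokens[1:]):
            unigrams[a] += 1
            bigrams[(a, b)] += 1
    return unigrams, bigrams, len(vocab)

def bigram_prob(a, b, unigrams, bigrams, V):
    # P(b | a) with add-one (Laplace) smoothing, so unseen bigrams keep some mass
    return (bigrams[(a, b)] + 1) / (unigrams[a] + V)

def perplexity(sent, unigrams, bigrams, V):
    # exp of the average negative log-probability per predicted token:
    # lower perplexity = the model is less "surprised" by the text
    tokens = ["<s>"] + sent.split() + ["</s>"]
    logp = sum(math.log(bigram_prob(a, b, unigrams, bigrams, V))
               for a, b in zip(tokens, tokens[1:]))
    return math.exp(-logp / (len(tokens) - 1))

# made-up toy corpus
unigrams, bigrams, V = train_bigram_lm(["the cat sat", "the dog sat"])
```

A fluent sentence from the training distribution gets lower perplexity than the same words shuffled, which is exactly the intuition behind the "is our model surprised?" video.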
8 videos (Total 84 min), 2 readings, 3 quizzes
8 videos
Perplexity: is our model surprised with a real text? (8m)
Smoothing: what if we see new n-grams? (7m)
Hidden Markov Models (13m)
Viterbi algorithm: what are the most probable tags? (11m)
MEMMs, CRFs and other sequential models for Named Entity Recognition (11m)
Neural Language Models (9m)
Whether you need to predict a next word or a label - LSTM is here to help! (11m)
2 readings
Perplexity computation (10m)
Probabilities of tag sequences in HMMs (20m)
2 practice exercises
Language modeling (15m)
Sequence tagging with probabilistic models (20m)
5 hours to complete

Vector Space Models of Semantics

This module is devoted to a higher level of abstraction for texts: we will learn vectors that represent meanings. First, we will discuss traditional models of distributional semantics. They are based on a very intuitive idea: "you shall know a word by the company it keeps". Second, we will cover modern tools for word and sentence embeddings, such as word2vec, FastText, StarSpace, etc. Finally, we will discuss how to embed whole documents with topic models and how these models can be used for search and data exploration....
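The "company it keeps" idea can be sketched in a few lines: build sparse co-occurrence vectors from a toy corpus and compare words with cosine similarity (an illustration only, with made-up sentences; real embeddings like word2vec learn dense vectors from such co-occurrence statistics):

```python
import math
from collections import defaultdict

def cooccurrence_vectors(sentences, window=2):
    # each word's vector = counts of words seen within +/- window positions
    vecs = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        toks = sent.split()
        for i, w in enumerate(toks):
            for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                if i != j:
                    vecs[w][toks[j]] += 1
    return vecs

def cosine(u, v):
    # cosine similarity between two sparse vectors stored as dicts
    dot = sum(c * v.get(t, 0) for t, c in u.items())
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# made-up toy corpus: "cat" and "dog" share contexts, "stocks" mostly does not
vecs = cooccurrence_vectors(["the cat sat on the mat",
                             "the dog sat on the mat",
                             "stocks rose on the news"])
```

Words that appear in similar contexts ("cat" and "dog") end up with similar vectors, which is the property the module's matrix-factorization and embedding methods exploit at scale.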
8 videos (Total 83 min), 3 quizzes
8 videos
Explicit and implicit matrix factorization (13m)
Word2vec and doc2vec (and how to evaluate them) (10m)
Word analogies without magic: king – man + woman != queen (11m)
Why words? From character to sentence embeddings (11m)
Topic modeling: a way to navigate through text collections (7m)
How to train PLSA? (6m)
The zoo of topic models (13m)
2 practice exercises
Word and sentence embeddings (15m)
Topic Models (10m)
5 hours to complete

Sequence to sequence tasks

Nearly any task in NLP can be formulated as a sequence-to-sequence task: machine translation, summarization, question answering, and many more. In this module we will learn a general encoder-decoder-attention architecture that can be used to solve them. We will cover machine translation in more detail, and you will see how the attention technique resembles the word alignment step in a traditional translation pipeline....
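The attention step of the encoder-decoder architecture reduces to a few lines: score each encoder state against the current decoder state, normalize with softmax, and take the weighted sum as the context vector. This is a plain-Python sketch of dot-product attention with made-up 2-dimensional states, not the course's implementation:

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(decoder_state, encoder_states):
    # dot-product attention: score each encoder state against the decoder state,
    # normalize with softmax, and return the weighted sum (the context vector)
    scores = [sum(d * h for d, h in zip(decoder_state, state))
              for state in encoder_states]
    weights = softmax(scores)
    dim = len(encoder_states[0])
    context = [sum(w * state[i] for w, state in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

# toy example: the decoder state matches the first encoder state,
# so most attention mass falls on it
weights, context = attend([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

The attention weights form a soft distribution over source positions — the same role hard word alignments play in the traditional translation pipeline.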
9 videos (Total 98 min), 4 quizzes
9 videos
Noisy channel: said in English, received in French (6m)
Word Alignment Models (12m)
Encoder-decoder architecture (6m)
Attention mechanism (9m)
How to deal with a vocabulary? (12m)
How to implement a conversational chat-bot? (11m)
Sequence to sequence learning: one-size fits all? (10m)
Get to the point! Summarization with pointer-generator networks (12m)
3 practice exercises
Introduction to machine translation (10m)
Encoder-decoder architectures (20m)
Summarization and simplification (15m)


started a new career after completing these courses


got a tangible career benefit from this course


got a pay increase or promotion


by GY, Mar 24th 2018

Great thanks to this amazing course! I learned a lot on state-of-the-art natural language processing techniques! Really like your awesome programming assignments! See you HSE guys in next class!

by MV, Mar 18th 2019

Definitely best course in the Specialization! Lecturers, projects and forum - everything is super organized. Only StarSpace was pain in the ass, but I managed :)



Anna Potapenko

HSE Faculty of Computer Science

Alexey Zobnin

Associate Professor
HSE Faculty of Computer Science

Anna Kozlova

Team Lead

Sergey Yudin


Andrei Zimovnov

Senior Lecturer
HSE Faculty of Computer Science

About National Research University Higher School of Economics

National Research University - Higher School of Economics (HSE) is one of the top research universities in Russia. Established in 1992 to promote new research and teaching in economics and related disciplines, it now offers programs at all levels of university education across an extraordinary range of fields of study including business, sociology, cultural studies, philosophy, political science, international relations, law, Asian studies, media and communications, mathematics, engineering, and more. Learn more on www.hse.ru...

About the Advanced Machine Learning Specialization

This specialization gives an introduction to deep learning, reinforcement learning, natural language understanding, computer vision and Bayesian methods. Top Kaggle machine learning practitioners and CERN scientists will share their experience of solving real-world problems and help you to fill the gaps between theory and practice. Upon completion of 7 courses you will be able to apply modern machine learning methods in enterprise and understand the caveats of real-world data and settings....
Advanced Machine Learning

Frequently Asked Questions

  • Once you enroll for a Certificate, you get access to all videos, quizzes, and programming assignments (if applicable). Peer-reviewed assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing, you may not be able to access certain assignments.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page; from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

More questions? Visit the Learner Help Center.