About this course
75,114 recent views

100% online

Start instantly and learn at your own pace.

Flexible deadlines

Reset deadlines in accordance with your schedule.

Advanced Level

The course requires a strong background in calculus, linear algebra, probability theory, and machine learning.

Subtitles: English, Korean

Skills you will gain

Bayesian Optimization, Gaussian Process, Markov Chain Monte Carlo (MCMC), Variational Bayesian Methods


Syllabus - What you will learn in this course

2 hours to complete

Introduction to Bayesian methods & Conjugate priors

Welcome to the first week of our course! Today we will discuss what Bayesian methods are and what probabilistic models are. We will see how they can be used to model real-life situations and how to draw conclusions from them. We will also learn about conjugate priors, a class of models where all the math becomes really simple.
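The conjugate-prior idea described above can be sketched in a few lines of Python: with a Beta prior on a coin's bias and Bernoulli observations, the posterior is again a Beta, so the update reduces to counting successes and failures. A minimal sketch (the Beta(1, 1) prior and the coin-flip data are illustrative, not from the course):

```python
def beta_bernoulli_posterior(a, b, data):
    """Beta(a, b) prior + Bernoulli data -> Beta(a + successes, b + failures)."""
    k = sum(data)    # number of successes (1s)
    n = len(data)    # number of trials
    return a + k, b + (n - k)

# Uniform Beta(1, 1) prior, then 7 heads out of 10 coin flips.
a_post, b_post = beta_bernoulli_posterior(1, 1, [1] * 7 + [0] * 3)
posterior_mean = a_post / (a_post + b_post)   # 8 / 12
```

Because the posterior stays in the Beta family, no numerical integration is needed; this is exactly why conjugate models keep the math simple.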

9 videos (Total 55 min), 1 reading, 2 quizzes
9 videos
Bayesian approach to statistics 5m
How to define a model 3m
Example: thief & alarm 11m
Linear regression 10m
Analytical inference 3m
Conjugate distributions 2m
Example: Normal, precision 5m
Example: Bernoulli 4m
1 reading
MLE estimation of Gaussian mean 10m
2 practice exercises
Introduction to Bayesian methods 20m
Conjugate priors 12m
6 hours to complete

Expectation-Maximization algorithm

This week we will learn about a central topic in probabilistic modeling: latent variable models and how to train them, namely with the Expectation-Maximization algorithm. We will see models for clustering and dimensionality reduction where the Expectation-Maximization algorithm can be applied as is. In weeks 3, 4, and 5 we will discuss numerous extensions of this algorithm that make it work for more complicated models and scale to large datasets.
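As a rough illustration of the E-step/M-step loop on a Gaussian Mixture Model, here is a minimal NumPy sketch for a 1-D mixture of two Gaussians (the synthetic data, initial values, and iteration count are all made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: two well-separated 1-D Gaussians.
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

# Illustrative initial parameters for two components.
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])        # mixing weights

for _ in range(50):
    # E-step: responsibilities r[n, k] proportional to pi_k * N(x_n | mu_k, sigma_k).
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and standard deviations.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
```

After a few dozen iterations the estimated means approach the true component centers; the same two-step structure carries over to the higher-dimensional GMMs discussed in the lectures.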

17 videos (Total 168 min), 3 quizzes
17 videos
Probabilistic clustering 6m
Gaussian Mixture Model 10m
Training GMM 10m
Example of GMM training 10m
Jensen's inequality & Kullback-Leibler divergence 9m
Expectation-Maximization algorithm 10m
E-step details 12m
M-step details 6m
Example: EM for discrete mixture, E-step 10m
Example: EM for discrete mixture, M-step 12m
Summary of Expectation Maximization 6m
General EM for GMM 12m
K-means from probabilistic perspective 9m
K-means, M-step 7m
Probabilistic PCA 13m
EM for Probabilistic PCA 7m
2 practice exercises
EM algorithm 8m
Latent Variable Models and EM algorithm 10m
2 hours to complete

Variational Inference & Latent Dirichlet Allocation

This week we will move on to approximate inference methods. We will see why we care about approximating distributions and study variational inference, one of the most powerful methods for this task. We will also examine the mean-field approximation in detail and apply it to a text-mining algorithm called Latent Dirichlet Allocation.
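One way to see what the mean-field approximation does: for a correlated 2-D Gaussian, the fully factorized approximation that minimizes KL(q||p) keeps the exact means but uses the diagonal of the precision matrix for the variances, so it underestimates the marginal uncertainty. A small NumPy check (the covariance numbers are illustrative):

```python
import numpy as np

# Target: a correlated bivariate Gaussian with unit marginal variances.
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
Lam = np.linalg.inv(Sigma)        # precision matrix

# Mean-field result for a Gaussian target: q_i(x_i) = N(m_i, 1 / Lam_ii).
mf_var = 1.0 / np.diag(Lam)       # factorized-approximation variances
true_var = np.diag(Sigma)         # true marginal variances
```

Here `mf_var` comes out strictly smaller than `true_var` whenever the correlation is nonzero, which is the classic variance-underestimation behavior of mean-field variational inference.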

11 videos (Total 98 min), 2 quizzes
11 videos
Mean field approximation 13m
Example: Ising model 15m
Variational EM & Review 5m
Topic modeling 5m
Dirichlet distribution 6m
Latent Dirichlet Allocation 5m
LDA: E-step, theta 11m
LDA: E-step, z 8m
LDA: M-step & prediction 13m
Extensions of LDA 5m
2 practice exercises
Variational inference 15m
Latent Dirichlet Allocation 15m
5 hours to complete

Markov chain Monte Carlo

This week we will learn how to approximate training and inference with sampling and how to sample from complicated distributions. This will allow us to build a simple method to deal with LDA and with Bayesian Neural Networks, i.e. neural networks whose weights are random variables themselves; instead of training (finding the best value for the weights), we will sample from the posterior distribution over the weights.
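The sampling idea of this week can be illustrated with a random-walk Metropolis-Hastings sampler: only an unnormalized target density is needed. A minimal sketch targeting a standard normal (the step size, chain length, and burn-in are arbitrary choices for the example):

```python
import numpy as np

def log_target(x):
    return -0.5 * x ** 2   # log N(0, 1) up to an additive constant

rng = np.random.default_rng(0)
x, samples = 0.0, []
for _ in range(20000):
    prop = x + rng.normal(0, 1.0)   # symmetric random-walk proposal
    # Accept with probability min(1, p(prop) / p(x)); done in log space.
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    samples.append(x)
samples = np.array(samples[2000:])  # discard burn-in
```

The retained samples have roughly zero mean and unit standard deviation, matching the target; for multivariate targets the same accept/reject scheme applies coordinate-wise or with a vector-valued proposal.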

11 videos (Total 122 min), 2 quizzes
11 videos
Sampling from 1-d distributions 13m
Markov Chains 13m
Gibbs sampling 12m
Example of Gibbs sampling 7m
Metropolis-Hastings: choosing the critic 8m
Example of Metropolis-Hastings 9m
Markov Chain Monte Carlo summary 8m
MCMC for LDA 15m
Bayesian Neural Networks 11m
1 practice exercise
Markov Chain Monte Carlo 20m
5 hours to complete

Variational Autoencoder

Welcome to the fifth week of the course! This week we will combine many ideas from the previous weeks and add some new ones to build a Variational Autoencoder, a model that can learn a distribution over structured data (like photographs or molecules) and then sample new data points from the learned distribution, hallucinating new photographs of non-existing people. We will also apply the same techniques to Bayesian Neural Networks and see how this can greatly compress the weights of the network without reducing accuracy.
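A core ingredient of the Variational Autoencoder mentioned above is the reparameterization trick: instead of sampling z ~ N(mu, sigma^2) directly, one samples eps ~ N(0, 1) and computes z = mu + sigma * eps, so gradients can flow through mu and sigma. A NumPy sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.5               # illustrative encoder outputs
eps = rng.normal(size=100_000)     # noise, independent of the parameters
z = mu + sigma * eps               # deterministic, differentiable in mu and sigma
```

The samples `z` have the desired N(mu, sigma^2) distribution, but since the randomness now lives only in `eps`, a framework with automatic differentiation can backpropagate through `mu` and `sigma`, which is what makes VAE training tractable.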

10 videos (Total 79 min), 3 readings, 3 quizzes
10 videos
Modeling a distribution of images 10m
Using CNNs with a mixture of Gaussians 8m
Scaling variational EM 15m
Gradient of decoder 6m
Log derivative trick 6m
Reparameterization trick 7m
Learning with priors 5m
Dropout as Bayesian procedure 5m
Sparse variational dropout 5m
3 readings
VAE paper 10m
Relevant papers 10m
Categorical Reparametrization with Gumbel-Softmax 10m
2 practice exercises
Variational autoencoders 16m
Categorical Reparametrization with Gumbel-Softmax 18m
4 hours to complete

Gaussian processes & Bayesian optimization

Welcome to the final week of our course! This time we will look at nonparametric Bayesian methods. Specifically, we will learn about Gaussian processes and their application to Bayesian optimization, which allows one to perform optimization in scenarios where each function evaluation is very expensive: oil probing, drug discovery, and neural network architecture tuning.
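The Gaussian-process regression formulas covered this week can be sketched directly in NumPy: with an RBF kernel, the posterior mean and variance at test points follow from standard linear-algebra identities. A minimal sketch (the training points, lengthscale, and noise jitter are illustrative):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

X = np.array([-2.0, 0.0, 2.0])     # training inputs
y = np.sin(X)                      # noise-free training targets
noise = 1e-6                       # small jitter for numerical stability

K = rbf(X, X) + noise * np.eye(len(X))
Xs = np.array([0.0, 1.0])          # test inputs
Ks = rbf(X, Xs)

alpha = np.linalg.solve(K, y)
mean = Ks.T @ alpha                               # GP posterior mean
cov = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
var = np.diag(cov)                                # GP posterior variance
```

At a training input the posterior mean reproduces the observed value and the variance collapses to near zero, while between training points the variance grows; Bayesian optimization exploits exactly this uncertainty estimate to decide where to evaluate the expensive function next.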

7 videos (Total 58 min), 2 quizzes
7 videos
Gaussian processes 8m
GP for machine learning 5m
Derivation of main formula 11m
Nuances of GP 12m
Bayesian optimization 10m
Applications of Bayesian optimization 5m
1 practice exercise
Gaussian Processes and Bayesian Optimization 16m
5 hours to complete

Final project

In this module you will apply the methods you learned in this course to a final project.

1 quiz
95 ratings

Top reviews from Bayesian Methods for Machine Learning

by JG, Nov 18th 2017

This course is a little difficult, but I found it very helpful.

Also, I didn't find a better course on Bayesian methods anywhere on the net, so I will recommend this to anyone who wants to dive into Bayesian methods.

by LB, Jun 7th 2019

Excellent course! The perfect balance of clear and relevant material and challenging but reasonable exercises. My only critique would be that one of the lecturers sounds very sleepy.



Daniil Polykovskiy

HSE Faculty of Computer Science

Alexander Novikov

HSE Faculty of Computer Science

About National Research University Higher School of Economics

National Research University - Higher School of Economics (HSE) is one of the top research universities in Russia. Established in 1992 to promote new research and teaching in economics and related disciplines, it now offers programs at all levels of university education across an extraordinary range of fields of study including business, sociology, cultural studies, philosophy, political science, international relations, law, Asian studies, media and communications, mathematics, engineering, and more. Learn more at www.hse.ru.

About the Advanced Machine Learning Specialization

This specialization gives an introduction to deep learning, reinforcement learning, natural language understanding, computer vision and Bayesian methods. Top Kaggle machine learning practitioners and CERN scientists will share their experience of solving real-world problems and help you to fill the gaps between theory and practice. Upon completion of 7 courses you will be able to apply modern machine learning methods in enterprise and understand the caveats of real-world data and settings.
Advanced Machine Learning

Frequently asked questions

  • Once you enroll for a Certificate, you will have access to all videos, quizzes, and programming assignments (if applicable). Peer-reviewed assignments can only be submitted and reviewed once your session has begun. If you want to explore the course without purchasing it, you may not be able to access certain assignments.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page, from where you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

  • The course requires a strong background in calculus, linear algebra, probability theory, and machine learning.

Have more questions? Visit the Learner Help Center.