
Learner Reviews & Feedback for Bayesian Statistics: Mixture Models by University of California, Santa Cruz

4.6 stars (47 ratings)

About the Course

Bayesian Statistics: Mixture Models introduces you to an important class of statistical models. The course is organized into five modules, each of which contains lecture videos, short quizzes, background reading, discussion prompts, and one or more peer-reviewed assignments. Statistics is best learned by doing, not just by watching a video, so the course is structured to help you learn through application. Some exercises require the use of R, a freely available statistical software package. A brief tutorial is provided, but we encourage you to take advantage of the many other resources online for learning R if you are interested. This is an intermediate-level course, designed to be the third in UC Santa Cruz's series on Bayesian statistics, after Herbie Lee's "Bayesian Statistics: From Concept to Data Analysis" and Matthew Heiner's "Bayesian Statistics: Techniques and Models." To succeed in the course, you should have some knowledge of and comfort with calculus-based probability, principles of maximum-likelihood estimation, and Bayesian estimation....

Top reviews

SM • Jan 19, 2021

I learned a lot about Bayesian mixture models, expectation maximization, and MCMC algorithms, and their use cases in classification and clustering problems. I highly recommend this course.

RL • Feb 10, 2023

I really enjoyed this course! Plenty of examples of how to use Mixture Models in a Machine Learning context. Thanks to Abel and his team for putting together such a useful course.


1 - 18 of 18 Reviews for Bayesian Statistics: Mixture Models

By Rohit D • Jun 19, 2020

Bayesian Statistics: Mixture Models (BS3 for short)

As of June 2020, BS3 is a new class; it appears to have arrived on Coursera around April 2020.

The class creators (Prof. Abel Rodriguez and others) have done an excellent job of pulling together the requisite theory (video lectures) and practice (assignments in R).

For most people, including those with a modest amount of training in statistics or computer science, this class will feel advanced. To reasonably comprehend the material, one needs to be familiar with Monte Carlo simulation (specifically Gibbs sampling) and a broad spectrum of probability distributions (Poisson, Beta, Gamma, Inverse-Gamma, Log-Normal, Dirichlet) used in Bayesian statistics. The first two Bayesian Statistics classes cover most of these prerequisites well.

BS3 delves into two ways of estimating mixtures, namely Expectation-Maximization (EM) and Gibbs sampling, and compares results from the two approaches. BS3 does not stop at a "Gaussian mixture of two univariate distributions." Through its assignments, the class motivates the need for other mixture models, such as the zero-inflated Poisson, a mixture of exponential and log-normal distributions, and a mixture of multivariate Gaussians.
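
To give a concrete sense of the EM approach the reviewer mentions, here is a minimal sketch for a two-component univariate Gaussian mixture in R. It is illustrative only, with simulated data and made-up variable names, not code from the course:

# EM for a two-component univariate Gaussian mixture (illustrative sketch only).
set.seed(1)
x <- c(rnorm(150, 0, 1), rnorm(50, 4, 0.5))        # simulated data from two components
w <- c(0.5, 0.5); mu <- c(-1, 1); sigma <- c(1, 1)  # initial guesses
for (iter in 1:200) {
  # E-step: responsibilities (posterior probability of each component per observation)
  d1 <- w[1] * dnorm(x, mu[1], sigma[1])
  d2 <- w[2] * dnorm(x, mu[2], sigma[2])
  r2 <- d2 / (d1 + d2); r1 <- 1 - r2
  # M-step: update weights, means, and standard deviations
  w     <- c(mean(r1), mean(r2))
  mu    <- c(sum(r1 * x) / sum(r1), sum(r2 * x) / sum(r2))
  sigma <- c(sqrt(sum(r1 * (x - mu[1])^2) / sum(r1)),
             sqrt(sum(r2 * (x - mu[2])^2) / sum(r2)))
}
round(c(w, mu, sigma), 3)   # estimated weights, means, and standard deviations

The course's assignments go well beyond this toy case (zero-inflated and multivariate models, for example), but the alternating E-step/M-step structure is the same.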

Some assignments require manipulating hierarchical probability distributions with several techniques at once - maximum likelihood estimation, identifying conjugate priors, and simulation. Since the manipulations are coded in R and must produce a numerical result, typos and algebraic errors are unforgiving.

The class organizers chose to have graded assignments (six in all) peer-reviewed. The peer review requirement can feel like a constraint for a class that is relatively new and advanced, and thus has low attendance.

It took me ~60 hours to complete this class over approximately two weeks. Ideally, I would have preferred to spread the course out over the recommended five weeks; life constraints dictated otherwise. Even so, the effort is well worth it. I am walking away with a much better appreciation of Bayesian statistics in general and mixture models in particular.

By zj s • Aug 31, 2020

This seems to be a new course, opened only in recent months, and compared with other courses on Coursera it has not attracted much attention yet. The course mainly introduces (Bayesian) generative mixture models and methods of parameter inference (EM/MCMC, not variational inference). Strictly speaking, it covers a small branch of machine learning.

This is a course that combines principles and practice. It mainly uses Gaussian mixture and zero-inflated mixture models as examples to explain the principles and derivations, supplemented by demonstration code to aid understanding. There is also corresponding homework to help reinforce the concepts; if the homework is done carefully, most of the material covered in the course should be mastered.

The difficulty of the course is moderate for me. According to Coursera it is advanced level, with an official time estimate of about 21 hours. Of course, some prior knowledge is required.

By Murray S • May 31, 2022

While I liked the instructor and the presentation of the topics, I did find it somewhat tedious to sit through the instructor writing out the various derivations on the board. I think the topic would be better served by providing the key results on summary slides and perhaps providing the derivations in an appendix that the student could download. The instructor provided notes for the course, which are excellent (in general, I find the lack of notes or references to useful books to be a gap in many Coursera courses).

By Tomas F • Dec 21, 2021

Good course, but few students are taking it, so when you need to move forward and complete the peer-reviewed assignments it becomes difficult to get your work graded or to find others' work to review.

By piaoyang • Dec 11, 2021

This course focuses on the estimation and application of mixture models. Estimation covers EM and MCMC, and applications include density estimation, clustering, and classification, along with how to compare different mixture models and what to pay attention to when using them. Generally speaking, the first two courses are the foundation of this one, and after finishing it you can truly understand how Bayesian statistics is applied. For me, after studying this course I finally understand how the LDA topic model is built and solved. Although the course only covers two parameter-estimation methods, and there are many variations in practice, they are all built on these two. At the beginning I felt the course was a bit difficult, but the text materials provided in the course filled in the places I didn't understand, so I learned everything in the end. The coding assignments also helped me understand how these algorithms are implemented and showed me their practical applications. A really great course: although it is difficult, you can understand it through the course videos and materials, plus some time spent thinking.

The only downside is that the wait for peer review is too long. I completed and submitted all the assignments nearly a month ago, but the peer review still has not been completed.

By Rajendra A • Jan 26, 2021

Excellent course material, video lectures, and programming assignments. I learnt EM and MCMC. I initially thought the programming assignments would be difficult, but after following the videos and instructions I started gaining confidence. I highly recommend this course.

By Rubén A G L • Feb 10, 2023

I really enjoyed this course! Plenty of examples of how to use Mixture Models in a Machine Learning context. Thanks to Abel and his team for putting together such a useful course.

By Chow K M • May 18, 2021

Definitely quite mathematical in nature. A good way to learn about the expectation-maximisation algorithm.

By Pavel K • Oct 23, 2023

The course focuses more on practical aspects, giving a general view of the topic. It covers EM/MCMC algorithms for fitting mixture models and discusses some of their pitfalls. Examples of using these models (density estimation, (semi-)supervised classification, and clustering) are also given, as well as an introduction to the procedure for selecting the number of components. Unfortunately, the course has certain disadvantages, in particular the peer-reviewed submission tasks, where the grade depends heavily on fellow students, their competence, and their presence in the course. Even though they are optional (honors track), these assignments could have been designed as automatically graded programming exercises, which would have made the resulting grades fairer and avoided the long wait for at least some reviewers.
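
As a rough illustration of the MCMC side the reviewer describes, a minimal Gibbs sampler for a two-component Gaussian mixture with known unit variances could be sketched in R as follows. The data are simulated, the names are made up, and the priors chosen here (mu_k ~ N(0, 100), w ~ Beta(1, 1)) are assumptions for the sketch, not necessarily those used in the course:

# Gibbs sampler sketch for a two-component Gaussian mixture with known unit variance.
set.seed(2)
x <- c(rnorm(150, 0, 1), rnorm(50, 4, 1))     # simulated data from two components
n <- length(x); mu <- c(-1, 1); w <- 0.5      # initial values
keep_mu <- matrix(NA, 1000, 2)
for (s in 1:1000) {
  # 1. Sample component indicators z given the current parameters
  p1 <- w * dnorm(x, mu[1], 1)
  p2 <- (1 - w) * dnorm(x, mu[2], 1)
  z  <- rbinom(n, 1, p2 / (p1 + p2)) + 1      # z is 1 or 2
  # 2. Sample the weight of component 1 from its Beta full conditional
  n1 <- sum(z == 1); n2 <- n - n1
  w  <- rbeta(1, 1 + n1, 1 + n2)
  # 3. Sample each component mean from its normal full conditional
  for (k in 1:2) {
    nk   <- sum(z == k)
    xbar <- if (nk > 0) mean(x[z == k]) else 0
    post_var  <- 1 / (nk + 1 / 100)           # posterior precision = nk + 1/100
    post_mean <- post_var * nk * xbar
    mu[k] <- rnorm(1, post_mean, sqrt(post_var))
  }
  keep_mu[s, ] <- mu
}
colMeans(keep_mu[501:1000, ])   # posterior means after discarding burn-in

This toy sketch ignores label switching and the choice of the number of components, both of which the course discusses.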

By Marcelo B • Jul 31, 2020

Good course; it goes deep whenever needed. Great lectures. The exercises follow the lectures closely. I have enjoyed the course and would suggest it to anybody who wants to fill in some corners of Bayesian statistics.

By Suraj M • Jan 20, 2021

I learned a lot about Bayesian mixture models, expectation maximization, and MCMC algorithms, and their use cases in classification and clustering problems. I highly recommend this course.

By Cameron D K • Oct 15, 2023

Great course. The professor is excellent. The material is fairly advanced, so make sure to take the prerequisites first.

By Dongliang Z • Apr 14, 2023

An excellent course that illustrates the models step by step. I like the process of derivation.

By Rick S • Jul 1, 2021

Just enough theory and practice. Great class.

By Jaime A C • Jan 12, 2022

The content of the course is amazing. I've learned a lot, and the explanations and exercises were well designed and interesting. The reason I rate it 4 stars is that there are a lot of peer-review tasks, which is something I don't like at all, for several reasons. One important reason is that you cannot get an honors certificate unless you have peers to review, and I found no projects to review even though I completed all the assignments.

By Dziem N • Jun 16, 2021

This course is the best of the series from UC Santa Cruz. The lecturer explains the rather complicated concepts with clarity. I find the examples really helpful for further grasping the concepts.

Thank you...

By Rahul S • Apr 3, 2021

Very good course.

By Daniel V • Sep 25, 2023

The content is very dense, and I would have liked more practical examples of how to apply the concepts.