
Learner Reviews & Feedback for Natural Language Processing with Probabilistic Models by DeepLearning.AI

4.7 stars (1,650 ratings)

About the Course

In Course 2 of the Natural Language Processing Specialization, you will: a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming, b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics, c) Write a better auto-complete algorithm using an N-gram language model, and d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
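As a concrete illustration of item (a) above, here is a minimal sketch of minimum edit distance with dynamic programming; the costs used (insert 1, delete 1, replace 2) are one common convention and an assumption here, not a claim about the course's exact grader:

```python
def min_edit_distance(source: str, target: str,
                      ins_cost: int = 1, del_cost: int = 1, rep_cost: int = 2) -> int:
    """Cost of turning `source` into `target` via inserts, deletes, and replacements."""
    m, n = len(source), len(target)
    # D[i][j] = minimum cost of converting source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,    # delete from source
                          D[i][j - 1] + ins_cost,    # insert into source
                          D[i - 1][j - 1] + r)       # replace (or match)
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4: two replacements at cost 2 each
```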

Top reviews

NM

Dec 12, 2020

A truly great course, focuses on the details you need, at a good pace, building up the foundations needed before relying more heavily on libraries and abstractions (which I assume will follow).

HS

Dec 2, 2020

A neatly organized course introducing students to the basics of processing text data, learning word embeddings and, most importantly, how to interpret the word embeddings. Great job!!


226 - 250 of 286 Reviews for Natural Language Processing with Probabilistic Models

By Nima M

Nov 6, 2020

The content of the course was really interesting and engaging. But the assignments mostly only helped in understanding the details of the algorithms and processes. It would have been nice to learn how to use state-of-the-art libraries, which would've been more practical. Although, in fairness, anybody who completes this course should be able to make use of off-the-shelf libraries. Another point was that when the instructor was narrating the slides, his intonation was occasionally a bit off, making me lose track of the subject and having to re-listen a few times.

By Haosheng Z

Aug 21, 2022

Personally speaking, this course is great. I have a background in math, so it was very easy for me to infer all the mathematical details from a general outline or framing of a method, but I can imagine that people with other backgrounds may suffer from the lack of rigorous proofs in this course. Nevertheless, the course did give me new ideas and made me familiar with some practically useful tricks in NLP. I would recommend this course if you are working in a field other than NLP and want to learn something about it, or if you are a beginner in NLP.

By Евгений

May 22, 2023

Good and instructive course. A minor problem with this course is that the authors tried to make it less intimidating for students who lack math skills, and as a result some explanations are not rigorous. For example, the principle of extracting word embeddings from the CBOW model is explained purely on the basis of the dimensionality of the weight matrices, which leaves a lot of questions unless one studies the supplementary materials.
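For readers puzzled by the same point: in CBOW the embeddings are read off the trained weight matrices, and the dimensionality argument is simply that each matrix has one axis of vocabulary size V and one of embedding size N. A minimal NumPy sketch, assuming one common shape convention (W1 of shape N×V, W2 of shape V×N) that may differ from the course's exact notebooks:

```python
import numpy as np

# Toy sizes for illustration only: V = vocabulary size, N = embedding dimension.
V, N = 5, 3

# Stand-ins for a trained CBOW model's weights (random here, trained in practice).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(N, V))  # input-to-hidden weights
W2 = rng.normal(size=(V, N))  # hidden-to-output weights

# Option 1: take each column of W1 as the embedding of one vocabulary word.
emb_w1 = W1.T                 # shape (V, N)

# Option 2: take each row of W2 as the embedding.
emb_w2 = W2                   # shape (V, N)

# Option 3: average the two views; this only makes sense because the
# shapes line up, which is the dimensionality argument the review refers to.
embeddings = (W1.T + W2) / 2  # shape (V, N)

print(embeddings.shape)       # (5, 3)
```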

By Yen L B

Sep 7, 2020

Good for the basics of NLP. Good mix of examples from classical NLP (e.g. n-grams) and neural nets (e.g. embeddings). As usual from deeplearning.ai, great motivating examples such as autocorrect and autocomplete to help us understand the materials. The neural net examples could do with more equations as in other deeplearning.ai courses.

By Shantimohan E

Dec 11, 2021

The quiz for week 1 contains topics from week 4, and it had not been changed in the two weeks I was on this course. Except for this lacuna, everything else was very nice. The course is well structured, and the assignments made me think and revise the course material thoroughly. In a nutshell, this was an excellent course.

By Mares B

Dec 2, 2020

Thank you for the lectures. I enjoyed them a lot! One thing I did not like too much was the fast reading aloud of complex equations; I got distracted a lot when that happened. Also, the grader for the programming assignments is very slow, and some additional verification of the programming tasks would be helpful.

By Anatoly L

Dec 4, 2021

There is some confusion in the week 1 practical quiz: it seems that this task is from week 4. There are confusions in the contents, but in general this is a good course, because we go through the program from simple to difficult tasks and implement the necessary computations and library functions from scratch.

By Kostyantyn B

Oct 18, 2020

A good course overall. I wish the assignments were a bit more challenging though. Still, we have covered a lot of ground. And for those who know nothing about word embeddings, I think this would be a perfect first course to take. So all in all, time well spent.

By James P

Sep 17, 2020

I found the course really helped to reinforce my understanding of important concepts like n-grams, HMMs, and word embeddings. The labs are pretty well spread out, and by the time you get to the week-ending assignments, you have all the info you need to complete them.

By vijaya k e

Feb 3, 2022

It would be beneficial if we could apply the knowledge at work while learning. Formulas in the videos, readings, and assignments are sometimes different. There is almost no help in the community forum if we are stuck on assignments. It would help if we could get help from TAs.

By Manish S

Dec 28, 2022

I love this course. If you follow along with the book An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition by Daniel Jurafsky and James H. Martin, it will help you understand the core concepts and mathematics.

By Will H

Mar 31, 2021

The lectures on the Viterbi algorithm were a little wooden, and there were no summary text (reading) tasks (as there often are in other deeplearning.ai courses); however, this is a worthwhile and informative course.

By Rafael C F d A

Jan 16, 2021

In the first and second weeks, the exercises have some unnecessary tricks in the data formatting just to make them harder, but that takes attention away from what matters in the course, which is NLP.

By Germán M

Dec 30, 2020

Very good to see how the "from scratch" concepts are presented; nevertheless, I have the feeling that very few "real use case" problems have been presented, with only tiny sentences being analyzed.

By Aneesh B

Jun 18, 2022

The week 4 lab assignment could be made a little bit tougher. The backpropagation derivation of W1, W2, b1, and b2 could be offered as an optional reading for the interested reader. Otherwise, amazing course!

By Osama A O

Oct 7, 2020

Good course, but the lecture notes in week 2 could be much improved. Understanding the Viterbi algorithm without visuals and animations was very difficult. Apart from that, great course!

By Ramprakash V

Aug 19, 2020

The course is exceptional in its own way by bringing people to an understanding of probabilistic models. Crisp and clear. But one needs to explore and practise more to gain expertise.

By Fabio

Nov 14, 2022

I liked learning about Word2Vec in week 4, using Continuous Bag of Words, step by step. It helped me a lot to understand how to transform words into numeric vectors.

By Cheng J

Sep 9, 2020

The Viterbi algorithm introduction is a bit hard to follow. Perhaps some written notes would help guide us through each step.

By Leo H

Mar 4, 2024

Feels a bit outdated in times of big foundation models, but some concepts are still useful for gaining a fundamental understanding.

By Hernan J

Nov 4, 2020

This Specialization and the Deep Learning one complement each other, and the concepts and practical exercises become clearer. Thank you!

By Daniel R

Sep 1, 2022

Quite a lot of the coding is done for you, but otherwise a great course to give an overview of NLP with prob. models.

By Daniel W

Mar 25, 2021

The tutor sometimes moves past the slides too swiftly. I wish he would wait 2-3 seconds after he finishes speaking.

By Michael G A

Oct 21, 2023

The explanations for backprop and its equations in the assignment were very lackluster and almost misleading.

By Sandeep V

Oct 2, 2020

Some quizzes should also be included. The assignments can be solved with Python knowledge and by following the instructions.