
Learner reviews and feedback for Natural Language Processing with Probabilistic Models by deeplearning.ai

4.7 stars
1,371 ratings

About the Course

In Course 2 of the Natural Language Processing Specialization, you will:

a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming,
b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics,
c) Write a better auto-complete algorithm using an N-gram language model, and
d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....
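As a taste of the first of these topics, a minimal sketch of minimum edit distance computed with dynamic programming (the technique behind the auto-correct assignment) might look as follows; the cost values (insert = 1, delete = 1, replace = 2) are illustrative assumptions, not necessarily the ones used in the course:

    # Minimal sketch: minimum edit distance via dynamic programming.
    # The costs (insert=1, delete=1, replace=2) are illustrative assumptions.
    def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
        m, n = len(source), len(target)
        # D[i][j] = minimum cost of turning source[:i] into target[:j]
        D = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            D[i][0] = D[i - 1][0] + del_cost
        for j in range(1, n + 1):
            D[0][j] = D[0][j - 1] + ins_cost
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                r = 0 if source[i - 1] == target[j - 1] else rep_cost
                D[i][j] = min(D[i - 1][j] + del_cost,    # delete
                              D[i][j - 1] + ins_cost,    # insert
                              D[i - 1][j - 1] + r)       # replace or keep
        return D[m][n]

    print(min_edit_distance("play", "stay"))  # 4 with these costs (two replacements)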

Top reviews

NM

Dec 12, 2020

A truly great course; it focuses on the details you need, at a good pace, building up the foundations needed before relying more heavily on libraries and abstractions (which I assume will follow).

HS

Dec 2, 2020

A neatly organized course introducing students to the basics of processing text data, learning word embeddings, and, most importantly, how to interpret those embeddings. Great job!!

Filter by:

26 - 50 of 242 reviews for Natural Language Processing with Probabilistic Models

by Simon P

Nov 27, 2020

Simply put, it's not great.

The assignments are long and complex, with insufficient checks to help you debug when there's an error. The theory is poorly explained in both the videos and the labs. They clearly do not know who this course is aimed at. Is it software engineers who want to better understand NLP? In that case they may find the assignments easy but the content lacking. Is it people with a basic understanding of NLP who want to take it further? In that case they will not get that, given that the concepts are only briefly discussed. Is it a general introduction to NLP? In that case the coding is pitched too high; you have to be familiar with all the little Python tricks the authors know and think the same way they do. This leads to a frustrating experience.

Hovering the cursor over the names of contributors in the discussion forums makes it clear that most of the people who start this course never finish it. This level of attrition reflects poorly on the course creators.

by Slava S

Jan 11, 2022

I was actually unpleasantly surprised by this course. Having completed the whole DL specialization, I was used to a certain quality of courses there. This one, however, has a lot of bugs (for example, when I finished week 1, the quiz was, and may still be, from week 4), and the quizzes are just repetitions of questions from the lectures (even the answers are the same).

Also, all of the weeks except week 4 are more about programming in Python than about NLP. Even the last week's assignment is more about writing basic backprop for a simple shallow network than about working with embeddings.

The assignments are too easy. Splitting every topic into 5-minute videos makes it easier to watch, but I think this format does not allow for much detail on each topic, so in the end this course feels too shallow for a 4-week course.

by Rajaseharan R

Jan 28, 2022

All quizzes and assignments need to be revised and tested again. There seem to be multiple errors in the provided functions. There are also some quiz questions that don't make sense (there is no actual question), and some quizzes don't follow the week's lectures. Some of the exercises need more help in the code comments, as they are not covered in the lectures, e.g. the Week 4 back_prop assignment.

by Dimitry I

Apr 14, 2021

A very superficial course, just like the rest of the specialization. The quizzes and assignments are a joke. I didn't want to give negative feedback at first, but now that I am doing course #4 in the specialization, which covers material I don't know much about (Attention), I've realized how bad these courses are. Very sad.

by Dave J

Jan 25, 2021

This course gradually ramps up the sophistication and interest from the first course in the NLP specialization.

Week 1: Autocorrect and Minimum Edit Distance is OK; nothing to write home about, but it gives you a sense of how a basic autocorrect mechanism works and introduces dynamic programming.

Week 2: Part of Speech Tagging explains Hidden Markov Models and the Viterbi algorithm pretty well. More of a sense here of learning something that will be a useful foundation.
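(For readers who want to see the shape of the algorithm this review mentions, here is a toy Viterbi sketch for HMM tagging; the tags, words, and probabilities below are invented for illustration and are not taken from the course.)

    import numpy as np

    # Toy Viterbi sketch for HMM part-of-speech tagging.
    # Tags, words, and probabilities are invented for illustration.
    tags = ["NN", "VB"]
    A = np.array([[0.6, 0.4],          # transition probs P(next tag | current tag)
                  [0.7, 0.3]])
    B = {"light": [0.3, 0.2],          # emission probs P(word | tag)
         "shines": [0.1, 0.5]}
    pi = np.array([0.5, 0.5])          # initial tag distribution

    def viterbi(sentence):
        T, N = len(sentence), len(tags)
        prob = np.zeros((N, T))                # best path probability per tag and step
        back = np.zeros((N, T), dtype=int)     # back-pointers to recover the path
        prob[:, 0] = pi * B[sentence[0]]
        for t in range(1, T):
            for j in range(N):
                scores = prob[:, t - 1] * A[:, j] * B[sentence[t]][j]
                back[j, t] = np.argmax(scores)
                prob[j, t] = np.max(scores)
        best = [int(np.argmax(prob[:, T - 1]))]
        for t in range(T - 1, 0, -1):          # follow back-pointers from the end
            best.append(int(back[best[-1], t]))
        return [tags[i] for i in reversed(best)]

    print(viterbi(["light", "shines"]))        # ['NN', 'VB'] with these numbers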

Week 3: Autocomplete and Language Models explains what a language model is and builds a basic N-gram language model for autocompleting a sentence. Again, good foundations.
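(As an aside, the core of such an N-gram model is little more than counting; a minimal bigram sketch over a tiny invented corpus might look like this:)

    from collections import Counter, defaultdict

    # Minimal bigram language model sketch; the tiny corpus is invented for illustration.
    corpus = ["i like nlp", "i like dogs", "i am happy"]

    bigram_counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, word in zip(tokens, tokens[1:]):
            bigram_counts[prev][word] += 1

    def predict_next(word):
        # P(next | word) from relative counts (no smoothing in this sketch).
        counts = bigram_counts[word]
        total = sum(counts.values())
        return {w: c / total for w, c in counts.items()}

    print(predict_next("like"))   # {'nlp': 0.5, 'dogs': 0.5}
    print(predict_next("i"))      # {'like': 0.67, 'am': 0.33} (approximately)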

Week 4: Word embeddings with neural networks was for me the most interesting part of the specialization so far. The amount of lecture & lab content is considerably higher than in the previous weeks (which is a good thing in my view). The pros and cons of different ways of representing words as vectors are discussed, then different ways of generating word embeddings, from research papers dating from 2013 to 2018. The rest of the week focuses on implementing the continuous bag-of-words (CBOW) model for learning word embeddings with a shallow neural network. The whole process, from data preparation to building & training the network and extracting the embeddings, is explained & implemented in Python with NumPy, which is quite satisfying.
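(To give a concrete picture of that week, here is a rough sketch of one CBOW training step with a shallow network in NumPy; the vocabulary size, embedding dimension, learning rate, and the purely linear hidden layer are simplifying assumptions, not the exact architecture of the assignment.)

    import numpy as np

    # Rough sketch of one CBOW training step with a shallow network in NumPy.
    # Sizes, learning rate, and the linear hidden layer are simplifying assumptions.
    V, N, lr = 5000, 50, 0.05                   # vocab size, embedding dim, step size
    rng = np.random.default_rng(0)
    W1 = rng.normal(0, 0.1, (N, V))             # input -> hidden (embeddings live here)
    W2 = rng.normal(0, 0.1, (V, N))             # hidden -> output

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    def cbow_step(context_ids, center_id):
        global W1, W2
        # Average of the one-hot context vectors.
        x = np.zeros(V)
        x[context_ids] = 1.0 / len(context_ids)
        h = W1 @ x                              # hidden layer: averaged context embeddings
        y_hat = softmax(W2 @ h)                 # predicted distribution over the vocabulary
        y = np.zeros(V)
        y[center_id] = 1.0
        e = y_hat - y                           # cross-entropy gradient at the output
        dW2 = np.outer(e, h)
        dW1 = np.outer(W2.T @ e, x)             # backprop through the hidden layer
        W2 -= lr * dW2
        W1 -= lr * dW1
        return -np.log(y_hat[center_id])        # loss, useful for monitoring training

    loss = cbow_step(context_ids=[10, 42, 99, 123], center_id=7)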

I found that the labs and assignments worked flawlessly. They are largely paint-by-numbers, though; I would have liked to have been challenged and made to think more. The teaching is pretty good, though there's room for improvement. It tends to focus a little narrowly on the specific topic being covered and has the feel of reading from a script. What I would like to see is more stepping back, thinking about and explaining the larger context of how the topic fits into current NLP and the student's learning journey, then engaging with the learner on this basis. I did feel this course was a little better than course 1 in that regard. Overall 4.5 stars, but as there are no half stars, I'm going to let week 4 tip it up to 5.

by Yuri C

Dec 29, 2020

I enjoyed this second course in the NLP specialization very much! I must say, once again the balance between mathematical formalism and hands-on coding is just on point, which is not easy to achieve. I also quite enjoyed the infographics about the word embedding model developed during the course. I have been reading blog posts and papers about the technique for some time now and I have not seen a better explanation than the one in this course, chapeau! Nevertheless, there are also points of improvement to consider. One of my main concerns is that at the end of some assignments there is very little discussion of the validity and usefulness of what we get, even though a lot is put forward in the motivation. For example, while building the autocomplete, a lot of time was dedicated to motivating why this is useful and why one should learn it, but at the very end of the week, when we finally build one with Twitter data, there is very little analysis of the results. This is a bit frustrating. Of course, one cannot build very useful models in an assignment in a Jupyter notebook; nevertheless, I am positive that a good balance can be found here too between analyzing the model's outputs and asking whether we indeed achieved the goal we set at the beginning, and if not, why not. Clearly, assignments are not research papers, but a bit more careful treatment on that end would let this course achieve its full potential. Keep up the good work!

by John Y

Dec 8, 2021

I really enjoyed the first two courses so far. I finished the DL Specialization before I took this NLP sequence. I'm glad that they tended to focus on the basic and essential concepts and not too much on details like data cleaning, although they do show you how these things are done too. But there are really a lot of things to digest. These courses could potentially fill whole semesters in school, but I think they covered the important material well. I especially enjoyed learning new things like hashing, dynamic programming, and Markov models. I found the labs very helpful because they divided the amount of material to digest into smaller portions. I also found them very helpful for the homework. Although some people didn't like the short videos, I actually liked them. They were mostly to the point and easier for me to review. People comment that they miss Andrew Ng's lectures. Unfortunately, I don't think Andrew can teach many more classes because he's busy with many things, although that would be awesome. However, I think that Younes did a great job of teaching. If I understood what was said and am able to do the quizzes and homework, then I'm good. Eventually, we're gonna have to work and think independently anyway. I think the NLP courses tend to steer us towards that kind of work habit. Thanks Younes, Łukasz, and the rest of the team for these awesome and wonderful classes! On to courses 3 and 4!

by Jonathan D

May 1, 2022

Really good course. Covers many topics thoroughly and succinctly. Very detailed implementations.

As feedback, I'd say it's often more difficult to follow and hit the nail on the head in terms of the "rigidity" of the implementations than with the concepts or notions being taught. It can feel more like a course on how well I understand the way this was designed and implemented than on the ideas about natural language processing that underpin it. In the same breath, I'd say that these implementation methods are of course now in my toolbox and they have shaped my way of thinking.

by Leena P

Oct 5, 2020

I enjoyed Younes's teaching style and the specialization's course structure of asking quiz questions in between the lectures. Also, the ungraded programming notebooks give grounding and hints while allowing the graded work to be challenging and not completely obvious. Thanks to the whole Coursera team for sharing such deep knowledge so universally and easily. This sharing of knowledge with all who seek it is, I think, the hope for AI to stay relevant and not get lost in hype.

by SHASHI S M

Dec 25, 2020

I learned auto-correction using the minimum edit distance algorithm, part-of-speech tagging with the Viterbi algorithm, autocomplete using an n-gram model, and word embeddings by applying the continuous bag-of-words model. This week was a little tough, and I got great hands-on experience in NLP. It changed my thinking about NLP. This week was amazing. I worked with the nltk library and created a neural network to train a model for word embeddings.

by Kshitiz D

Dec 7, 2021

Thank you Coursera for the financial aid! I was able to dive deeper into NLP and learn autocorrect, Markov chains, autocomplete, and word embeddings. The course was amazing throughout, and the practical assignments were nicely prepared to give a complete overview of the things taught in a particular week. The thing I didn't like about the course was the repetition of problems in the quizzes.

by Nishant M K

Mar 31, 2021

Great course! As with course 1 in this specialization, the REAL value lies in the Python/Jupyter notebooks, which have a great mix of filling out key steps along with very detailed and pointed descriptions. The lecture material is also very helpful in 'orienting' students, and the coding assignments are where the actual learning happens. I would very much recommend this course!

by Nilesh G

Jul 29, 2020

Great course, with nice content from basic to advanced NLP. The coverage of topics such as word embeddings, POS tagging, and auto-completion was very good. The assignments are challenging, but you learn a lot through hands-on practice, and the hints are useful. Looking forward to completing the remaining courses in this NLP specialization. Thanks to all the instructors!

by Cristopher F

May 8, 2021

This is an exciting course. This course will not make you 100% ready for the real world, but it will give you directions that you can follow by yourself. I think the purpose of learning is not to be stuck somewhere while losing your mind. It's to build a foundation where you can find your own path.

by Yixuan Z

Jan 8, 2021

Most of the knowledge was new to me, but I really enjoyed all the course content. I hope the autocomplete material could also show how to predict the next 2-5 or even more words based on N-gram models; the autocomplete assignment only covers predicting the next single word.
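(The extension this reviewer asks for is mostly a matter of feeding the model's own prediction back in; here is a hypothetical greedy sketch, where greedy_complete and predict_next are illustrative names, not functions from the assignment, and predict_next is any function returning a probability distribution over next words:)

    # Hypothetical sketch: predict several words by repeatedly applying a next-word
    # model. greedy_complete and predict_next are illustrative names, not from the course.
    def greedy_complete(prefix_word, predict_next, n_words=5):
        words = [prefix_word]
        for _ in range(n_words):
            probs = predict_next(words[-1])        # distribution over candidate next words
            if not probs:
                break
            next_word = max(probs, key=probs.get)  # greedily take the most likely word
            if next_word == "</s>":                # stop at an end-of-sentence marker
                break
            words.append(next_word)
        return " ".join(words)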

by Soopramanien V

Sep 30, 2020

A great course for learning word embeddings. The instructors are excellent at explaining key concepts in a very clear and concise way, and the accompanying assignments and labs serve their purpose in giving hands-on experience with implementing many of these NLP models.

by Prantik B

Aug 29, 2020

The overall content is very interesting and helpful, but weeks 2 and 3 were a little bit harder for me. I think that material could be a little more informative, so that anyone can get through the weekly assignments more easily.

by Usama I P

May 16, 2021

The best course for studying NLP. I started NLP as an experiment, but these guys made me fall in love with it, with such a clear and in-depth explanation of everything that I feel very confident. Thank you for such an awesome course.

by Cecilia E G R P

Sep 8, 2020

Excellent course; the explanations given by Professor Younes were very clear. It allowed me to learn more about how natural language processing works on the inside. Thank you to all the teachers for sharing their knowledge!

by Dustin Z

Aug 22, 2020

A good course that covers several important probabilistic models in NLP. Very good balance between challenging and easy. Some interesting software concepts, like dynamic programming, are also discussed. A fun course!

by A V A

Oct 25, 2020

An excellent and detailed description of how autocorrect and autocomplete work, as well as how parts of speech are tagged based on Markov models and how word embeddings are derived using a CBOW model. Thoroughly enjoyed this course!

by Christoph H

Jul 1, 2020

This course goes hand in hand with Daniel Jurafsky's introduction to NLP (Speech and Language Processing) and provides the know-how for hands-on implementation of simple but powerful probabilistic methods.

by vishal b

May 3, 2021

Amazing course. I just loved the way the faculty explained complex concepts in such an easy manner, and the hands-on labs and graded assignments are very helpful for reviewing one's understanding of the concepts.

by Vladimir B

Jan 22, 2022

This class is one of the best on the subject. The prof is very knowledgeable and explains concepts very clearly.

The code in the assignments and lectures is super clean and structured incredibly well.

by Noah M

Dec 13, 2020

A truly great course; it focuses on the details you need, at a good pace, building up the foundations needed before relying more heavily on libraries and abstractions (which I assume will follow).