Back to Natural Language Processing with Attention Models

Learner reviews and feedback for Natural Language Processing with Attention Models

774 ratings

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed Course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.



Oct 4, 2020

Could the instructors perhaps make a video explaining the ungraded lab? That would be useful. Other students also find the LSH attention layer ungraded labs difficult to understand. Thanks


Nov 20, 2020

The course is a very comprehensive one and covers all state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.

Filter by:

26 - 50 of 194 reviews for Natural Language Processing with Attention Models

by Israel T

Oct 7, 2020

Very educational! I learned a lot about the different NLP models. However, it seems like weeks 3 and 4 were rushed. Also, some of the items (e.g. what each layer does and why we need that layer) were not properly explained. Other than that, this is a good course for getting a general overview of some of the state-of-the-art NLP models.

by Mark L

Oct 2, 2021

(1) Please consider switching from Trax to TensorFlow. (2) The Transformer concepts, particularly some explanation of why Q, K, and V are called what they are, would be worth going over in more detail. (3) Not a problem of the course itself, but it would be helpful if the Trax documentation were more complete.
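For readers puzzled by the Q, K, V naming this reviewer mentions: the letters stand for queries, keys, and values. The following is a minimal NumPy sketch of scaled dot-product attention to illustrate the idea (my own illustration, not the course's Trax code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V: each query scores every key,
    then returns a weighted average of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V

# toy example: 2 queries, 3 key/value pairs, dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

In self-attention, Q, K, and V are all linear projections of the same input sequence, which is why the three names appear together.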

by Felix M

Apr 11, 2021

The classes originally taught by Andrew were, for me, much better. Many of the explanations in this course were superficial and not very clear, as I see it.

by Haoyu R

Oct 2, 2020

Not detailed enough. The quality of the course is very good at the start but decreases as the topics go deeper.

by Kévin S

Feb 15, 2022

Looks like a commercial ad for Trax. I don't know if I will be able to re-implement this in another framework.

by Darren

Feb 7, 2022

The general content is good, but there are so many inconsistencies and missing pieces of information in the material. Terms are poorly defined and used inconsistently. A lot of information about "why" certain things are the way they are in the programming assignments is missing -- you just "do it" without understanding it. Also, the instructors have abandoned the course forums. There are lots of questions about content in the discussion forums, but none of the content creators are helping answer them. We're just left to fend for ourselves. Not worth the money. Just watch the videos.

by Valerio G

Mar 24, 2021

I'm very disappointed with the whole NLP Specialization in general, but this course was the icing on the cake.

The course treats advanced and state-of-the-art techniques in NLP with neural networks, but the theoretical lectures are confusing and imprecise. The programming assignments are totally useless, since the user is asked to implement the network architectures discussed in the lectures using a fill-in-the-dots approach with a very restrictive starter structure. In my personal experience, this yielded a close-to-zero learning outcome, but a lot of frustration in trying to work around bugs in the auto-grading system by desperately browsing posts from the learner community.

I came here after the very nice Deep Learning Specialization held by Andrew Ng and wasn't expecting this.

by Yuri C

Jan 6, 2021

The last course in the NLP Specialization is intense! Already in the first week the learner is put through a tensor-algebra baptism, and it goes even deeper while building the locality-sensitive hashing inner workings. I am very grateful to the team for putting so much effort into teaching us how attention works and how to improve it in building the Reformer model. The opportunity to get this material from some of the developers of the model is priceless! Thank you for that! Surely, in everyday NLP one mostly uses the layers provided by Trax directly. But the understanding of the drawbacks of and the ideas behind these models is indeed the unique selling proposition of this whole course. The provided infographics are deeply helpful for understanding what goes on with the tensors inside the models, and the instructors do their best to introduce those ideas throughout the course. I was also *very* impressed to see how up-to-date all the material of this latest course is! Some of the papers about the models were put on arXiv only 1-2 years ago. This is by far very hard to beat in any massive open online course! Thank you very much for providing this to the community at such an accessible price tag. I will be eagerly waiting for a continuation of this specialization as Advanced NLP!
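For anyone curious about the locality-sensitive hashing idea this reviewer refers to: the Reformer approximates full attention by hashing vectors so that similar queries/keys tend to land in the same bucket, and attention is then computed only within each bucket. A toy random-hyperplane LSH sketch (my own simplified illustration, not the course's Trax implementation):

```python
import numpy as np

def lsh_buckets(vectors, n_hyperplanes=4, seed=0):
    """Hash vectors into buckets via random hyperplanes: vectors with
    high cosine similarity tend to fall on the same side of each
    hyperplane and therefore share a bucket id."""
    rng = np.random.default_rng(seed)
    planes = rng.normal(size=(vectors.shape[1], n_hyperplanes))
    signs = (vectors @ planes) > 0            # side of each hyperplane
    # read the sign pattern as a binary bucket id in [0, 2**n_hyperplanes)
    return signs @ (1 << np.arange(n_hyperplanes))

# two nearly parallel vectors and one opposite vector
x = np.array([[1.0, 0.0], [0.99, 0.05], [-1.0, 0.0]])
b = lsh_buckets(x)
print(b)  # similar vectors usually share a bucket; the opposite one never does
```

With buckets in hand, attention cost drops because each query only attends within its own bucket rather than over the full sequence.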



by satish b

Jan 1, 2021

One of the best courses I have ever taken. The course provides in-depth learning of Transformers from the creators of Transformers.

by Jonathan M

Nov 16, 2020

The course was wonderful, full of up-to-date content and explained really well. Good work!

by Akash M

Sep 26, 2020

Outstanding course. The course was rigorous.

by Simon P

Dec 6, 2020

The course could have been expanded into an entire specialization. There's a little too much information, and the first two assignments are disproportionately long and hard compared with the last two. It is cutting-edge material though, and well worth it.

Slight annoyance at the script reading, which means the videos lack a natural flow and you end up with nonsense sentences like "now we multiply double-uwe sub en superscript dee by kay sub eye superscript jay to get vee sub eye". Variables such as X_i should be referred to by what they actually represent, not by their algebraic representation, because this is not how the brain processes them when they are read from a page.

by Dave J

May 3, 2021

The content is interesting and current, citing some 2020 papers. I was disappointed by the amount of lecture material - around 40-45 minutes per week in weeks 1-3 and only 20 minutes in week 4, plus two Heroes of NLP interviews. The lectures have the feel of reading from a script rather than engaging with the learner. They're not bad but there's room for improvement. Explanations are usually adequate but some areas could have been explained more clearly.

Programming assignments worked smoothly in my experience, though not particularly challenging: they're largely "painting by numbers".

by RKX

Sep 24, 2021

It would be better to use TensorFlow to implement these cutting-edge algorithms, given its popularity in both academia and industry.

by Toon P

Jul 10, 2022

Sometimes the videos were a bit short. Instead of making three 3-minute videos (parts I, II, and III, each with an intro and outro of a minute), just make one solid video explaining the concepts. Some weeks had only 30 minutes of theory, with 15 minutes of useless intros and outros. I felt these minutes could have been used to go a bit deeper into the material. Furthermore, what's up with the names of the weeks? One week is named Q&A but actually explains all the Transformer models, with Q&A just a small application of them. Last but not least, you could fill in the assignments without even reading or understanding the concepts, because they are just auto-complete coding exercises. Normally I understand the theory better because of the assignments, but here I felt the assignments were sometimes useless and didn't really teach me to implement a solid ML model. It doesn't really match up with what you would have to implement when working at a company, I think. The assignments could involve a bit more independent coding with a non-auto-complete explanation. On the other hand, I think this is of course difficult to correct while keeping the course scalable.

by Amit J

Jan 2, 2021

Though the content is extremely good and cutting-edge, the course presentation and instruction haven't done it justice. [1] Teaching concepts through the assignments (and not covering them in detail in the lectures) is an absolutely bad idea. [2] Lecture explanations are ambiguous and immature at times. That the instructor is an excellent engineer but a poor teacher is very evident from the presentation. [3] If input/output dimensions had been mentioned at every boundary in the network illustrations, it would have made a lot of difference to the speed of understanding, without having to hunt through offline material and papers. [4] Using Trax is, I think, not a good idea for this course. The documentation is close to non-existent, many details of the functions are hidden, and the only way to understand them is to look at the code. A more established framework like TensorFlow or PyTorch would have been much more helpful.

Overall a disappointment given the quality of other courses available on Coursera.

by Laurence G

Apr 11, 2021

Pros: Good choice of content coverage. Provides a historical overview of the field, covering the transition from early work on seq2seq with LSTMs, through the early forays into attention, to the more modern models first introduced in Vaswani et al. Week 4 covers the Reformer model, which was quite exciting. Decent labs.

Cons: The videos aren't great; there are a lot of better resources out there, many actually included in the course's reference section. Trax is not a good framework for learners in comparison to PyTorch, but if you plan on using TPUs and appreciate the pure functional style and stack semantics then it's worthwhile. The labs can be a bit copy-pasty. Some of the diagrams are awful -- find other resources if this is a problem.

Overall: I'd probably rate this course a 3.5 but wouldn't round up. The videos really let things down for me, but I persisted because the lesson plan and labs were pretty good.

by Christine D

Jan 22, 2021

Even though the theory is very interesting and well explained, the videos dive too deep into certain concepts without explaining very well the practical things you can do with them.

The practical stuff, especially the graded assignments, is very centered around Trax, and the only things you have to know and understand are basic Python and logic. You don't really get to make your own stuff; you just fill in things like "temperature=temperature" or "counter += 1".

I preferred and recommend the first two courses in this NLP Specialization.

by Thomas H

May 21, 2021

While the course succeeds in getting the most important points across, the quality of both the video lectures and the assignments is rather disappointing. The more detailed intricacies of attention and transformer models are explained poorly without providing any intuition on why these models are structured the way they are. Especially the lectures on current state-of-the-art models like BERT, GPT and T5 were all over the place and didn't explain these models well at all.

by Zhuo Q L

Jul 4, 2021

It is exciting to learn about the state-of-the-art approaches in NLP, but as the last course of the specialization, one can feel that the quality and level of detail of the explanations dropped significantly. I like how the course introduces useful things like SentencePiece, BPE, and interesting applications, but some of them felt abrupt and weren't elaborated.

by Dan H

Apr 5, 2021

Pros: Good selection of state-of-the-art models (as of 2020). Also great lab exercises.

Cons: The video lectures and readings are not very helpful. Explanations of the trickier parts of the models and training processes are vague and ambiguous (and sometimes kind of wrong?). You can find more detailed and easier-to-understand lectures on YouTube.

by dmin d

Jan 7, 2021

Have to say, the instructor didn't explain the concepts well. A lot of the explanations don't make sense, or just give the final logic and skip all the details. I needed to search YouTube or Google to understand the details and concepts.

But it covers state-of-the-art models for NLP. It's a good starting point and helped save time.

by Oleksandr P

Apr 4, 2021

Although this course gives you an understanding of cutting-edge NLP models, it lacks detail. It is hard to understand the structure of a complex NLP model from a few-minute video. This course should offer step-by-step explanations, either in a larger number of lectures or by increasing their duration.

by Damian S

Feb 24, 2022

Course content is fantastic, but the assignments are ridiculous -- they test how well you can read directions, not how well you understand the content.