
Learner reviews and feedback for Natural Language Processing with Attention Models by

709 ratings
174 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....


Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

Jun 22, 2021

This course is brilliant; it talks about SOTA models such as the Transformer and BERT. It would be better to have a Capstone Project. Also, entire projects can be downloaded easily.

Filter by:

26 - 50 of 175 reviews for Natural Language Processing with Attention Models

by RKX

Sep 24, 2021

It would be better to use TensorFlow to implement these cutting-edge algorithms, given its popularity in both academia and industry.

by Evan P

Mar 22, 2021

Dr. Ng's Deep Learning specialization is so good: 5 stars. For me, this course was not nearly as good as the courses in that specialization. I felt like I could have just read the papers on BERT, GPT-2, T5, and the Reformer, and would have learned the same amount. The one exception was the lecture video on the history of Transformers (the evolution from ELMo -> BERT -> T5, etc.). Also the ungraded Reformer labs; those were good. But I personally didn't get very much value out of all the other lectures and labs.

by Jean-Luc B

Nov 8, 2020

Maybe it's my fault, but at some point in these courses I got lost in the logic and the whys of the network constructions. I managed the assignments because, for some of them, all you need to do to pass is copy and paste.

But I recognize the great value of the material; I think I'll need to revisit it and spend more time on the optional readings.

And still, overall a great specialization; thanks to everyone involved in these courses!

by Israel T

Oct 7, 2020

Very educational! I learned a lot about the different NLP models. However, it seems like weeks 3 and 4 were rushed. Also, some of the items (e.g., what each layer does and why we need that layer) were not properly explained. Other than that, this is a good course for getting a general overview of some of the state-of-the-art NLP models.

by Mark L

Oct 2, 2021

(1) Please consider switching from Trax to Tensorflow. (2) The concepts of Transformers, particularly some explanation of why Q, K and V are called such, would be helpful to go over in more detail. (3) Not a problem of the course, but it would be helpful if the Trax documentation were more complete.

by Felix M

Apr 11, 2021

The classes originally taught by Andrew were, for me, much better. Many of the explanations in this course were not very clear and, as I see it, rather superficial.

by Tianpei X

Nov 1, 2020

The homework is way too simplified, especially in weeks 3 and 4. My impression is that the ungraded labs were actually the real homework but were set aside to allow more people to pass. That is not a good compromise.

by Valerio G

Mar 24, 2021

I'm very disappointed with the whole NLP specialization in general, but this course was icing on the cake.

The course treats advanced and state-of-the-art techniques in NLP with neural networks, but the theoretical lectures are confusing and imprecise. The framework programming assignments are totally useless, since the user is asked to implement the network architectures discussed in the lectures using a "fill the dots" approach with a very restrictive starter structure. In my personal experience, this yielded a close-to-zero learning outcome, but a lot of frustration in trying to get around some bugs in the auto-grading system by desperately browsing the posts from the learner community.

I came here after the very nice Deep Learning Specialization held by Andrew Ng and wasn't expecting this.

by Siddharth S

Sep 19, 2021

TRAX absolutely made it super hard to learn and follow.

If it was explained using Tensorflow or Pytorch it would have been very beneficial.

by Rabin A

Apr 19, 2021

The course was pretty good. It introduced me to the state-of-the-art algorithms and techniques needed to have a sound understanding of NLP. One thing I didn't like about the teaching method in the whole specialization is that Younes was the one teaching the course content to us, but some lectures were presented as if Łukasz were giving them, although we could clearly tell from the voice that it was Younes. Thanks especially to Younes for doing all the hard work for the specialization. You deserve 5 stars.

by Dustin Z

Dec 17, 2020

A very good and detailed course. Definitely the most challenging course I have taken. It gives a good overview of Transformers, the current cutting edge of NLP models. It also provides great insight into Trax, Google Brain's ML framework, which was helpful in understanding how deep learning frameworks are built. One of the teachers is one of the authors of Trax!

by Ganesh s m

Oct 10, 2020

Every week's assignment brings a new challenge, and it was fun to complete the assignments. The course instructors explain concepts very well. This course takes you from beginner level to a professional level and covers every topic related to NLP. I enjoyed learning NLP, and I would like to thank everyone for making this course.

by Huu M T H

Sep 30, 2020

Good course overall. The last two weeks' assignments are a little bit too light. The instructors could introduce more about loading pretrained models and fine-tuning them, as this is a popular practice nowadays for small companies with limited resources (data/computation). An introduction to an "easy-to-use" framework such as huggingface is highly recommended.

by Rajendra A

Dec 30, 2020

This specialization covers everything from NLP basics to the advanced models currently being used. All the programming assignments, contents, and sessions were thoughtful. The exposure to the Trax library and the learning experience were really excellent. Thanks to the entire team of this specialization and to the Coursera team.

by Peter T

Jan 1, 2022

The final weeks of this course, especially, introduce cutting-edge NLP models and practices, such as T5, Huggingface, and Reformer. This entire course was comprehensive in breadth. Highly recommended, but you should be prepared to put 10x more hours into it than the Coursera estimates.

by Long L

Nov 18, 2020

Thank you Coursera and the DeepLearning.AI team. The moment I set foot on this journey I did not think I would love NLP so much. The course is very informative: it teaches NLP from the very first naive algorithm to the State-of-the-art models today.

by Bharathi k N

Oct 12, 2020

The course is so good and well presented. I really enjoyed the whole specialization. Thank you for this amazing course and the whole specialization, which taught me a lot. Thank you, Andrew Ng, and the team for this amazing specialization.

by Alan K F G

Oct 21, 2020

I learned a lot about Transformers and Reformers, which are among the most advanced models for NLP tasks. The instructors were fully prepared, though I'd prefer to see more animations in future courses. Thank you so much for spreading knowledge!

by Muhammad T W

Jun 12, 2021

This course has helped me a lot in developing my NLP skills, and now I am confident that I can solve NLP problems easily, because both instructors, Younes and Łukasz, have taught this course in a way that can be applied to any NLP problem.

by Patrick A

Nov 26, 2020

An excellent course that covers research that was published only about two months earlier.

It doesn't get more cutting edge than that, and the technology (reversible residual layers) is immediately applicable and a very powerful enabler.

Thanks a lot!

by vadim m

Oct 17, 2020

An amazing level of breadth and depth of the material presented. State of the art techniques are exemplified via carefully crafted lab assignments with sufficient hints for students to be able to comprehend hard technical concepts.

by Jim F

Feb 28, 2021

Thanks for setting out to do the impossible and creating this set of courses. You have opened a doorway to understanding where the state of the art is. The rest is up to me. That's the purpose of education.

by Simin F

Nov 27, 2020

Helpful and interesting! This course gradually led me to understand how the Transformer works and how it is optimized, along with several other models, without much confusion. Great thanks to the team!!

by lonnie

Jun 23, 2021

This course is brilliant; it talks about SOTA models such as the Transformer and BERT. It would be better to have a Capstone Project. Also, entire projects can be downloaded easily.


Nov 21, 2020

The course is a very comprehensive one and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.