
Learner reviews and feedback for Natural Language Processing with Attention Models by deeplearning.ai

4.3 stars
712 ratings
175 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) translate complete English sentences into German using an encoder-decoder attention model, b) build a Transformer model to summarize text, c) use T5 and BERT models to perform question-answering, and d) build a chatbot using a Reformer model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow or Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3, Natural Language Processing with Sequence Models, before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
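Many of the reviews below discuss the course's use of the Trax library for the programming assignments. For context, a minimal sketch of building the kind of encoder-decoder Transformer used for the English-to-German translation task might look as follows; the hyperparameter values mirror Trax's public translation demo and are illustrative assumptions, not the course's exact settings:

    import trax

    # Encoder-decoder Transformer for translation, as in Trax's public demo.
    # All hyperparameter values below are illustrative assumptions.
    model = trax.models.Transformer(
        input_vocab_size=33300,   # shared subword vocabulary size (assumed)
        d_model=512,              # embedding / hidden dimension
        d_ff=2048,                # width of the feed-forward sublayers
        n_heads=8,                # attention heads per layer
        n_encoder_layers=6,
        n_decoder_layers=6,
        max_len=2048,             # maximum sequence length
        mode='predict',           # 'train', 'eval', or 'predict'
    )
    print(model)                  # prints the nested layer structure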

Top reviews

JH
Oct 4, 2020

Could the instructors perhaps make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

LL
Jun 22, 2021

This course is brilliant; it covers SOTA models such as Transformer and BERT. It would be better to have a Capstone Project. Also, the entire projects can be downloaded easily.


126–150 of 175 reviews for Natural Language Processing with Attention Models

By CLAUDIA R R

Sep 7, 2021

It's a great course, more difficult than I thought, but very well structured and explained. That said, more didactic free videos from other websites can complement the lessons.

By Anand K

Oct 15, 2020

Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate to the material.

By Moustafa S

Oct 3, 2020

Good course, covers everything I guess. The only downside for me is the Trax portion; I would've preferred if it was in TensorFlow maybe, but still a great job.

By Mohan N

Mar 28, 2021

The course covers cutting-edge content and the exercises are well paced. I found the Transformer lessons a bit difficult to understand.

By RAHUL J

Sep 29, 2020

Not up to expectations. Needs more explanation of some topics. Some were difficult to understand; examples might have helped!

By Vaseekaran V

Sep 20, 2021

It's a really good course to learn from and get introduced to attention models in NLP.

By David M

Oct 25, 2020

An amazing experience throughout, covering state-of-the-art NLP models.

By Shaojuan L

Dec 18, 2020

The programming assignment is too simple

By Fatih T

Feb 4, 2021

Great explanation of the topic, I guess!

By Sreang R

Dec 22, 2020

Awesome course

By Amit J

Jan 2, 2021

Though the content is extremely good and cutting edge, the course presentation/instructor hasn't been able to do justice to the course. [1] Teaching concepts through assignments (and not covering them in detail in lectures) is an absolutely bad idea. [2] Lecture instructions are ambiguous and immature at times. That the instructor is an excellent engineer but a bad teacher is very evident from the way of presentation. [3] If input/output dimensions had been mentioned at every boundary in the network illustrations, it would have made a lot of difference in speed of understanding, without having to hunt through offline material and papers. [4] Using Trax is, I think, not a good idea for this course. The documentation is all but non-existent, a lot of details of the functions are hidden, and the only way to understand them is to look at the code. A more established framework like TensorFlow or PyTorch would have been much more helpful.

Overall a disappointment given the quality of other courses available from Coursera.

By Laurence G

Apr 11, 2021

Pros: Good choice of content coverage. Provides a historical overview of the field, covering the transition from early work on seq2seq with LSTMs, through the early forays into attention, to the more modern models first introduced in Vaswani et al. Week 4 covers the Reformer model, which was quite exciting. Decent labs.

Cons: The videos aren't great; there are a lot of better resources out there, many actually included in the course's reference section. Trax is not a good framework for learners in comparison to PyTorch, but if you plan on using TPUs and appreciate the pure functional style and stack semantics, then it's worthwhile. The labs can be a bit copy-pasty. Some of the diagrams are awful; find other resources if this is a problem.

Overall: I'd probably rate this course a 3.5 but wouldn't round up. The videos really let things down for me, but I persisted because the lesson plan and labs were pretty good.

By Christine D

Jan 22, 2021

Even though the theory is very interesting and well explained, the videos dive too deep into certain concepts without explaining very well the practical things you can do with them.

The practical stuff, especially the graded assignments, is very centered around Trax, and the only things you have to know and understand are basic Python and logic. You don't really get to make your own stuff; you just fill in things like "temperature=temperature" or "counter +=1".

I preferred, and recommend, the first two courses in this NLP Specialization.

By Rishabh S

Sep 18, 2021

The course is very research-oriented and not very useful for data science practitioners. No time was spent on explaining how transformers can be used for NLP tasks on a small domain- or company-specific corpus through transfer learning. I'm not planning to develop the next blockbuster NN architecture for NLP, so the intricate details of how the Transformer and Reformer work seemed like overkill. Lastly, using Trax instead of a more production-ready framework like TensorFlow also made it feel very research-focused.

By Azriel G

Nov 20, 2020

The labs in the last two courses were excellent. However, the lecture videos were not very useful for learning the material. I think the course material deserves a v2 set of videos with more in-depth intuitions and explanations, and details on attention and its many variants, etc. There is no need to oversimplify the video lectures; they should feel at a similar level to the labs (assignments tend to be "too easy", but I understand why that is needed). Thanks for the courses. Azriel Goldschmidt

By Thomas H

May 21, 2021

While the course succeeds in getting the most important points across, the quality of both the video lectures and the assignments is rather disappointing. The more detailed intricacies of attention and transformer models are explained poorly, without providing any intuition on why these models are structured the way they are. Especially the lectures on current state-of-the-art models like BERT, GPT, and T5 were all over the place and didn't explain these models well at all.

By Kota M

Aug 23, 2021

This course perhaps gives a good overview of BERT and several other extensions such as T5 and the Reformer. I could learn the conceptual framework of the algorithms and understood what we can do with them. However, I think the instructors chose an undesirable mix of rigour and intuition: the lectures are mostly about intuition, while in contrast the assignments are very detailed and go through each logical step one by one.

By Zhuo Q L

Jul 4, 2021

It is exciting to learn about the state-of-the-art approaches for NLP, but as the last course of the specialization, one can feel that the quality and level of detail of the explanations dropped significantly. I like how the course introduces useful things like SentencePiece, BPE, and interesting applications, but some of them felt abrupt and weren't elaborated.

By Dan H

Apr 5, 2021

Pros: Good selection of state-of-the-art models (as of 2020). Also great lab exercises.

Cons: The video lectures and readings are not very helpful. Explanations of the trickier parts of the models and training processes are vague and ambiguous (and sometimes kind of wrong?). You can find more detailed and easier-to-understand lectures on YouTube.

By dmin d

Jan 7, 2021

Have to say, the instructor didn't explain the concepts well. A lot of the explanations don't make sense, or just give the final logic and skip all the details. I needed to search YouTube or Google to understand the details and concepts.

But it covers state-of-the-art models for NLP. It's a good starting point and helped save time.

By Oleksandr P

Apr 4, 2021

Although this course gives you an understanding of cutting-edge NLP models, it lacks detail. It is hard to understand the structure of a complex NLP model in a few-minute video. This course should spread its step-by-step explanations across a larger number of lectures, or increase their duration.

By Nunzio V

Apr 7, 2021

Nice course, full of very interesting information. What a pity TensorFlow wasn't used. All that knowledge is unfortunately not work-ready, as Trax is not widely used in industry, and it hardly ever will be, in my opinion.

By Семин А С

Aug 9, 2021

The explanation of attention models, covering the attention mechanism itself and the other building blocks of the Transformer, was very confusing. It was sometimes really hard to understand what the lecturer actually meant.

By Michel M

Feb 9, 2021

The presented concepts are quite complex; I would prefer fewer details, since most learners will not understand them anyway, and more conceptual information on why these models are built the way they are.

By Zeev K

Oct 24, 2021

Not clear enough. The exercises weren't good enough; I didn't learn much from them. It would be a great idea to provide the slides at the end of every week for review.