
Learner Reviews & Feedback for Sequence Models for Time Series and Natural Language Processing by Google Cloud

4.4 stars
390 ratings
51 reviews

About the Course

This course is an introduction to sequence models and their applications, including an overview of sequence model architectures and how to handle inputs of variable length.

• Predict future values of a time series
• Classify free-form text
• Address time-series and text problems with recurrent neural networks
• Choose between RNNs/LSTMs and simpler models
• Train and reuse word embeddings in text problems

You will get hands-on practice building and optimizing your own text classification and sequence models on a variety of public datasets in the labs we’ll work on together.

Prerequisites: Basic SQL, familiarity with Python and TensorFlow...

Top Reviews

PR

Aug 11, 2019

Great way to practically learn a lot of stuff. Sometimes, a lot of it starts to go over head. But, it is completely worth the learning curve! Definitely recommend it!

JW

Nov 11, 2018

Excellent course for those who know RNN. Knowledge is refreshed and techniques are consolidated. More details about Google ecosystem is introduced.


26 - 50 of 51 Reviews for Sequence Models for Time Series and Natural Language Processing

by Kaushik G

May 14, 2020

Awesomely designed course

by Ilias P

Dec 04, 2018

I really loved it!

by Nguyễn V L

Apr 14, 2019

pretty great

by ELINGUI P U

Jan 27, 2019

Great one!

by Kamlesh C

Jun 23, 2020

Thank you

by Raja R G

Dec 11, 2018

Good

by Silviu M

Aug 28, 2019

The content is amazing and some of the implementations are really awesome! I am not a programmer, but this course opened my eyes to how many business opportunities there are to use data for AI in new products and services.

by Harsh S

Jul 23, 2019

Though not focused on fundamental concepts, it's a great course for learning to use TensorFlow and Google Cloud Platform for sequence modelling.

by Marios N

Jun 10, 2019

Very helpful, but needs more in-depth detail on how attention works and how the encoder/decoder trains and makes predictions.

by Bablesh S

Oct 18, 2019

Good course with enough practical exercises to get some hands-on experience.

by Печатнов Ю

Nov 22, 2018

The first quiz is very bad.

But overall the course is interesting and I like it :)

by Deepika

Jan 21, 2020

Felt like somewhat advanced learning, which made it tough to complete in time.

by Hemant D K

Dec 01, 2018

Very informative, very much useful to my ongoing work on NLP.

by Nick S

May 29, 2020

Great course, but no longer up to date with TF2.

by Kumar U

May 18, 2020

The code explanations could be more elaborate.

by borja v

Jul 22, 2019

Complex but interesting

by Matthias D

Jun 29, 2020

L

by Tiffany K

May 30, 2020

Not the best course. I think there are much better courses on time series modeling here on Coursera. The platform for the labs is also cumbersome.

by Noah M

Jul 06, 2020

The practice material (notebooks) didn't match the video lectures, but the course itself was good.

by Soroush A

Jul 30, 2019

The lecturer talks too fast and is not easy to understand. This topic was one of my favorites.

by ssen-advanced

Nov 06, 2019

The Chinese intonation and pronunciation were uncomfortable.

by Rubens Z

Jun 21, 2020

I need more advanced labs.

by Ali S

May 11, 2020

The content was very general and the code specifics were not reviewed. It felt primarily like Google Cloud advertising.

by José C L A

Jun 14, 2020

Expected more. Also, Katherine Zhao's sloooow voice makes it tedious and boring.

by Seth R

Jun 17, 2020

This is one of the most poorly taught courses. In Week 2 it is very difficult to follow the accent of the instructor. Also, the notebook did not have the right package for the poetry problem, and the instructor used a different environment. For the final Dialogflow problem, the concepts are not explained well. The course should be revised so that the important aspects of Tensor2Tensor are explained properly.