Vanishing Gradients with RNNs


Skills you'll gain

Natural Language Processing, Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Recurrent Neural Networks, Attention Models

Reviews

4.8 (27,493 reviews)

  • 5 stars: 83.65%
  • 4 stars: 13.04%
  • 3 stars: 2.54%
  • 2 stars: 0.47%
  • 1 star: 0.28%

JY

Oct 29, 2018

5 stars

The lectures cover lots of SOTA deep learning algorithms, and they are well designed and easy to understand. The programming assignments really enhance understanding of the lecture material.

SD

Sep 27, 2018

5 stars

Great hands-on instruction on how RNNs work and how they are used to solve real problems. It was particularly useful to incorporate Conv1D, Bidirectional, and Attention layers into RNNs and see how they work.

From the lesson

Recurrent Neural Networks

Discover recurrent neural networks, a type of model that performs extremely well on temporal data, and several of its variants, including LSTMs, GRUs, and Bidirectional RNNs.
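The vanishing-gradient problem this lesson covers, and which motivates variants like LSTMs and GRUs, can be sketched numerically. The snippet below is a minimal illustration (an assumed 1-D linear recurrence, not code from the course): backpropagation through time multiplies the gradient by the recurrent Jacobian at each step, so with a recurrent weight of magnitude below 1 the gradient shrinks exponentially with sequence length.

```python
# Minimal sketch of vanishing gradients in a plain RNN (assumed toy setup:
# a scalar hidden state with a linear recurrence h_t = w * h_{t-1}).
# Backprop through T steps multiplies the incoming gradient by w once per step.
w = 0.5        # recurrent weight with |w| < 1
steps = 50     # sequence length
grad = 1.0     # gradient arriving at the final time step
for _ in range(steps):
    grad *= w  # one backprop-through-time step: multiply by the Jacobian
print(grad)    # about 8.9e-16: effectively zero after 50 steps
```

With |w| > 1 the same loop instead grows the gradient exponentially (exploding gradients); gated architectures such as LSTMs and GRUs add additive paths through time so gradients can survive long sequences.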

Taught by

  • Andrew Ng, Instructor
  • Kian Katanforoosh, Senior Curriculum Developer
  • Younes Bensouda Mourri, Curriculum Developer
