Deep-Dive into TensorFlow Activation Functions

by
Coursera Project Network
In this Guided Project, you will:

Learn when, where, why, and how to use different activation functions

Code each activation function from scratch in Python

2 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

You've learned how to use TensorFlow. You've learned the important functions, how to design and implement sequential and functional models, and have completed several test projects. What's next? It's time to take a deep dive into activation functions: the essential function of every node and layer of a neural network, deciding whether the node fires and, in most cases, adding an element of non-linearity. In this 2-hour guided project, you will join me in a deep dive into an exhaustive list of activation functions usable in TensorFlow and other frameworks. I will explain the working details of each activation function, describe the differences between them along with their pros and cons, and demonstrate each function in use, both from scratch and within TensorFlow. Join me and boost your AI & machine learning knowledge, while also earning a certificate to strengthen your resume in the process! Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
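
As a quick taste of the TensorFlow side, here is a minimal sketch (assuming TensorFlow 2.2 or later, where swish is built in; the selection of activation names is my own) that applies a few built-in activations to the same pre-activation values:

    import tensorflow as tf

    # Example pre-activations, as a dense layer might produce before activation
    x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

    # Resolve built-in activations by name and compare their outputs
    for name in ["sigmoid", "tanh", "relu", "swish"]:
        fn = tf.keras.activations.get(name)
        print(f"{name:8s}", fn(x).numpy())

The same names can also be passed directly to a layer, e.g. tf.keras.layers.Dense(8, activation="swish").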

Skills you will gain

  • Neural Network Activation Functions
  • Deep Learning
  • Artificial Neural Network
  • Python Programming
  • TensorFlow

Learn step by step

In a video that plays in one half of your workspace, your instructor will guide you through these steps:

  1. Review the Activation Functions, Their Properties & the Principle of Nonlinearity (sketched after this list)

  2. Implementing Linear and Binary Step Activations (sketched after this list)

  3. Implementing Ridge-based Activation Functions (ReLU family)

  4. Implementing Variations of ReLU & the Swish Family of Non-Monotonic Activations (steps 3 and 4 are sketched together after this list)

  5. Implementing Radial-based Activation Functions (RBF family; sketched after this list)
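
Ahead of step 1, the principle of nonlinearity can be previewed in a few lines of NumPy; this demonstration is my own illustration of the idea, not material from the videos:

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(4, 3))  # weights of a first "layer"
    W2 = rng.normal(size=(2, 4))  # weights of a second "layer"
    x = rng.normal(size=3)

    # Without an activation between them, stacked linear layers collapse into
    # a single linear map, so depth alone adds no representational power
    print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))  # True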
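
For step 2, a minimal from-scratch sketch of the linear and binary step activations (the function names and defaults here are my own, for illustration):

    import numpy as np

    def linear(x, a=1.0):
        # f(x) = a * x: scales its input but adds no non-linearity
        return a * x

    def binary_step(x):
        # Fires 1 for non-negative inputs, 0 otherwise; its gradient is zero
        # almost everywhere, which is why it is unsuitable for backpropagation
        return np.where(x >= 0.0, 1.0, 0.0)

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(linear(x), binary_step(x))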
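
For steps 3 and 4, a sketch of ReLU, two common variations, and Swish, again from scratch (the alpha and beta defaults are common choices, not necessarily those used in the course):

    import numpy as np

    def relu(x):
        # Ridge activation: max(0, x); cheap and sparse, but units can "die"
        return np.maximum(0.0, x)

    def leaky_relu(x, alpha=0.01):
        # Keeps a small negative slope so gradients still flow for x < 0
        return np.where(x >= 0.0, x, alpha * x)

    def elu(x, alpha=1.0):
        # Smoothly saturates toward -alpha for large negative inputs
        return np.where(x >= 0.0, x, alpha * (np.exp(x) - 1.0))

    def swish(x, beta=1.0):
        # Non-monotonic: x * sigmoid(beta * x); smooth, lets small negatives through
        return x / (1.0 + np.exp(-beta * x))

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    for fn in (relu, leaky_relu, elu, swish):
        print(fn.__name__, fn(x))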
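
Finally, for step 5, a Gaussian member of the radial family, whose response depends on distance from a center rather than on a one-sided threshold (the center and gamma values are illustrative defaults):

    import numpy as np

    def gaussian_rbf(x, center=0.0, gamma=1.0):
        # Radial activation: peaks at the center, decays with squared distance
        return np.exp(-gamma * (x - center) ** 2)

    print(gaussian_rbf(np.array([-2.0, 0.0, 2.0])))  # approx. [0.018, 1.0, 0.018]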

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download is required.

In a split-screen video, your instructor guides you step by step.

Frequently asked questions

Have more questions? Visit the Learner Help Center.