This 1-week, accelerated course builds on the previous courses in the Data Engineering on Google Cloud Platform specialization. Through a combination of video lectures, demonstrations, and hands-on labs, you'll learn how to create and manage compute clusters to run Hadoop, Spark, Pig, and/or Hive jobs on Google Cloud Platform. You will also learn how to access various cloud storage options from your compute clusters and integrate Google's machine learning capabilities into your analytics programs. In the hands-on labs, you will create and manage Dataproc clusters using the web console and the CLI, and use those clusters to run Spark and Pig jobs. You will then create IPython notebooks that integrate with BigQuery and Cloud Storage and utilize Spark. Finally, you will integrate the machine learning APIs into your data analysis.

Pre-requisites
• Google Cloud Platform Big Data & Machine Learning Fundamentals (or equivalent experience)
• Some knowledge of Python

COMPLETION CHALLENGE
Complete any GCP specialization from November 5 to November 30, 2019 for an opportunity to receive a GCP t-shirt (while supplies last). Check the Discussion Forums for details.