AWS - Big Data on AWS

3 days

Course Description

Big Data on AWS introduces you to cloud-based big data solutions and Amazon EMR (Elastic MapReduce), the AWS big data platform. In this course, we show you how to use Amazon EMR to process data with the broad ecosystem of Hadoop tools such as Pig and Hive. We also teach you how to create big data environments, work with Amazon DynamoDB and Amazon Redshift, understand the benefits of Amazon Kinesis, and apply best practices to design big data environments for security and cost-effectiveness.

This course helps prepare you for the AWS Certified Data Analytics - Specialty certification.

Note: Lab time is available only for the duration of the class, not beyond. Additional lab charges apply for repeat students.

Course Objectives

  • Understand Apache Hadoop in the context of Amazon EMR
  • Understand the architecture of an Amazon EMR cluster
  • Launch an Amazon EMR cluster using an appropriate Amazon Machine Image and Amazon EC2 instance types
  • Choose appropriate AWS data storage options for use with Amazon EMR
  • Ingest, transfer, and compress data for use with Amazon EMR
  • Use common programming frameworks available for Amazon EMR, including Hive, Pig, and Streaming
  • Work with Amazon Redshift to implement a big data solution
  • Leverage big data visualization software
  • Choose appropriate security options for Amazon EMR and your data
  • Perform in-memory data analysis with Spark and Spark SQL on Amazon EMR
  • Choose options to manage your Amazon EMR environment cost-effectively
  • Understand the benefits of using Amazon Kinesis for big data
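One objective above, using the Streaming framework on Amazon EMR, comes down to writing a mapper and a reducer that read from stdin and write to stdout; Hadoop handles the sort and shuffle in between. The word-count sketch below (illustrative, not taken from the course materials) shows the shape of such scripts and emulates the map | sort | reduce pipeline locally in plain Python:

```python
from itertools import groupby

def mapper(lines):
    """Map step: emit a (word, 1) pair for every token, as a
    Streaming mapper would write tab-separated pairs to stdout."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce step: sum the counts for each word. Hadoop Streaming
    guarantees the reducer receives pairs sorted by key."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

# Locally, the Streaming pipeline can be emulated as map -> sort -> reduce:
sample = ["EMR runs Hadoop", "Hadoop runs MapReduce"]
counts = dict(reducer(sorted(mapper(sample))))
# counts == {"emr": 1, "hadoop": 2, "mapreduce": 1, "runs": 2}
```

On a real cluster, the same mapper and reducer scripts would be submitted as a Streaming step, with the cluster distributing the input splits across nodes.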

Who Should Attend?

This course is intended for:

  • Solutions Architects and SysOps Administrators responsible for designing and implementing big data solutions
  • Data Scientists and Data Analysts interested in learning about big data solutions on AWS

Prerequisites

Required

  • Familiarity with big data technologies, including Apache Hadoop and HDFS (knowledge of Pig, Hive, and MapReduce is helpful but not required)
  • Working knowledge of core AWS services and public cloud implementation
  • Basic understanding of data warehousing, relational database systems, and database design
  • Completion of the AWS Technical Essentials course, or equivalent experience

Recommended

  • CCC Big Data Foundation (BDF)

Course Outline:

Note: The curriculum below comprises activities typically covered in a class at this skill level. The instructor may, at their discretion, adjust the lesson plan to meet the needs of the class based on regional location and/or the language in which the class is delivered.

  1. Overview of Big Data
  2. Data Ingestion, Transfer, and Compression
  3. AWS Data Storage Options
  4. Using DynamoDB with Amazon EMR
  5. Using Kinesis for Near Real-Time Big Data Processing
  6. Introduction to Apache Hadoop and Amazon EMR
  7. Using Amazon Elastic MapReduce
  8. The Hadoop Ecosystem
  9. Using Hive for Advertising Analytics
  10. Using Streaming for Life Sciences Analytics
  11. Using Hue with Amazon EMR
  12. Running Pig Scripts with Hue on Amazon EMR
  13. Spark on Amazon EMR
  14. Running Spark and Spark SQL Interactively on Amazon EMR
  15. Using Spark and Spark SQL for In-Memory Analytics
  16. Managing Amazon EMR Costs
  17. Securing your Amazon EMR Deployments
  18. Data Warehouses and Columnar Datastores
  19. Introduction to Amazon Redshift
  20. Optimizing Your Amazon Redshift Environment
  21. The Big Data Ecosystem on AWS
  22. Visualizing and Orchestrating Big Data
  23. Using Tibco Spotfire to Visualize Big Data
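The data warehouse modules above (items 18-20) rest on the idea behind Amazon Redshift's columnar storage: an aggregate over one column touches far less data when each column is stored contiguously than when whole rows are stored together. A toy illustration in Python (the table and figures are invented for this example; Redshift adds compression and zone maps on top of this layout):

```python
# Row-oriented layout: each record's fields are stored together.
rows = [
    {"user_id": i, "region": "us-east-1", "spend": i * 1.5}
    for i in range(1000)
]

# Column-oriented layout: each attribute is stored contiguously.
columns = {
    "user_id": [r["user_id"] for r in rows],
    "region":  [r["region"] for r in rows],
    "spend":   [r["spend"] for r in rows],
}

# SELECT SUM(spend): a row store must read every field of every record ...
row_values_touched = sum(len(r) for r in rows)   # 3 fields x 1000 rows = 3000
# ... while a column store reads only the one column the query needs.
col_values_touched = len(columns["spend"])       # 1000

total_spend = sum(columns["spend"])
```

The same trade-off applies in reverse: fetching a single complete record is cheaper in the row layout, which is why columnar stores like Redshift suit analytics rather than transactional workloads.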