
Apache Spark

  • Course Code: Big Data - Apache Spark
  • Course Dates: Contact us to schedule.
  • Course Category: Big Data & Data Science
  • Duration: 2 Days
  • Audience: This course is geared for those who want a practical guide to solving complex data processing challenges by applying the best optimization techniques in Apache Spark.

Course Snapshot 

  • Duration: 2 days 
  • Skill-level: Foundation-level Apache Spark skills for intermediate-skilled team members. This is not a basic class. 
  • Targeted Audience: This course is geared for those who want a practical guide to solving complex data processing challenges by applying the best optimization techniques in Apache Spark. 
  • Hands-on Learning: This course is approximately 50% hands-on labs and 50% lecture, combining engaging lectures, demos, group activities and discussions with machine-based student labs and exercises. Student machines are required. 
  • Delivery Format: This course is available for onsite private classroom presentation. 
  • Customizable: This course may be tailored to target your specific training skills objectives, tools of choice and learning goals. 

Apache Spark is a flexible framework for processing both batch and real-time data, and its unified engine has made it popular for big data use cases. This course will help you get started with Apache Spark 2.0 and write big data applications for a variety of use cases. Although the course is intended to get you started with Apache Spark, it also focuses on explaining the core concepts. This practical guide provides a quick start to the Spark 2.0 architecture and its components, and teaches you how to set up Spark on your local machine. From there, you will be introduced to resilient distributed datasets (RDDs) and the DataFrame API, along with their corresponding transformations and actions. We then move on to the life cycle of a Spark application and the techniques used to debug slow-running applications. You will also work through Spark's built-in modules for SQL, streaming, machine learning, and graph analysis. Finally, the course lays out the best practices and optimization techniques that are key to writing efficient Spark applications. By the end of this course, you will have a sound fundamental understanding of the Apache Spark framework and will be able to write and optimize Spark applications. 
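
To make the RDD and DataFrame ideas above concrete, here is a minimal, hypothetical PySpark sketch (assuming a local Spark installation with the pyspark package; the application name and sample data are illustrative, not part of the course materials) showing lazy transformations and the actions that trigger them:

    # Minimal sketch, assuming a local Spark installation with the pyspark package available.
    from pyspark.sql import SparkSession

    # A local SparkSession; in the course labs this would point at whatever cluster you configure.
    spark = SparkSession.builder.master("local[*]").appName("spark-course-sketch").getOrCreate()

    # RDD API: transformations (map, filter) are lazy; the action (collect) triggers execution.
    numbers = spark.sparkContext.parallelize(range(10))
    even_squares = numbers.map(lambda x: x * x).filter(lambda x: x % 2 == 0)
    print(even_squares.collect())

    # DataFrame API: the same lazy model behind a declarative, optimizer-friendly interface.
    people = spark.createDataFrame([("alice", 34), ("bob", 45)], ["name", "age"])
    people.filter(people.age > 40).show()

    spark.stop()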

Working in a hands-on learning environment, led by our Apache Spark expert instructor, students will: 

  • Learn about the core concepts and the latest developments in Apache Spark 
  • Master writing efficient big data applications with Spark’s built-in modules for SQL, Streaming, Machine Learning and Graph analysis 
  • Get introduced to a variety of optimizations based on real-world experience 

Topics Covered: This is a high-level list of topics covered in this course. Please see the detailed Agenda below. 

  • Learn core concepts such as RDDs, DataFrames, transformations, and more 
  • Set up a Spark development environment 
  • Choose the right APIs for your applications 
  • Understand Spark’s architecture and the execution flow of a Spark application 
  • Explore built-in modules for SQL, streaming, ML, and graph analysis 
  • Optimize your Spark job for better performance 

Audience & Pre-Requisites 

This course is geared for attendees who want a practical guide to solving complex data processing challenges by applying the best optimization techniques in Apache Spark. 

Pre-Requisites: 

  • Basic to intermediate IT skills. 
  • No previous exposure to large-scale data analysis or NoSQL tools is required. 
  • Familiarity with traditional databases is helpful. 

Course Agenda / Topics 

  1. Introduction to Apache Spark 
  • What is Spark? 
  • Spark architecture overview 
  • Spark language APIs 
  • Spark components 
  • Making the most of Hadoop and Spark 
  2. Apache Spark Installation 
  • AWS Elastic Compute Cloud (EC2) 
  • Configuring Spark 
  3. Spark RDD 
  • What is an RDD? 
  • Programming using RDDs 
  • Transformations and actions 
  • Types of RDDs 
  • Caching and checkpointing 
  • Understanding partitions 
  • Drawbacks of using RDDs 
  4. Spark DataFrame and Dataset 
  • DataFrames 
  • Datasets 
  5. Spark Architecture and Application Execution Flow 
  • A sample application 
  • Application execution modes 
  • Application monitoring 
  6. Spark SQL (see the sketch following this agenda) 
  7. Spark Streaming, Machine Learning, and Graph Analysis 
  • Spark Streaming 
  • Machine learning 
  • Graph processing 
  8. Spark Optimizations (see the sketch following this agenda) 
  • Cluster-level optimizations 
  • Application optimizations 
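
The following short, hypothetical PySpark sketch previews the Spark SQL and application-optimization topics referenced in the agenda above; the SparkSession setup, view name, and column names are illustrative assumptions, not course materials:

    # Minimal sketch, assuming a local Spark installation with the pyspark package available.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[*]").appName("spark-sql-optimization-sketch").getOrCreate()

    # Spark SQL: register a DataFrame as a temporary view and query it with plain SQL.
    orders = spark.createDataFrame(
        [("o1", "books", 12.50), ("o2", "games", 59.99), ("o3", "books", 7.25)],
        ["order_id", "category", "amount"],
    )
    orders.createOrReplaceTempView("orders")
    spark.sql("SELECT category, SUM(amount) AS total FROM orders GROUP BY category").show()

    # Application-level optimizations: cache a DataFrame that is reused, and control partitioning
    # before a wide operation so work is spread more evenly across executors.
    orders.cache()
    repartitioned = orders.repartition(4, "category")
    print(repartitioned.rdd.getNumPartitions())

    spark.stop()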