Let us help you find the training program you are looking for.

If you can't find what you are looking for, contact us and we'll help you find it. We have over 800 training programs to choose from.

  • Course Skill Level:

    Foundational to Intermediate

  • Course Duration:

    2 days

  • Course Delivery Format:

    Live, instructor-led.

  • Course Category:

    Big Data & Data Science

  • Course Code:

    STRDATL21E09

Who should attend & recommended skills

  • Developers familiar with relational database concepts who want to learn the concepts and requirements of streaming and real-time data systems.
  • Skill level: foundation-level streaming data skills for intermediate-level team members; this is not a basic class.
  • IT skills: Basic to Intermediate (1-5 years’ experience)
  • Streaming: No experience required
  • Real-time applications: No experience required

About this course

Streaming Data is an idea-rich tutorial that teaches you to think about efficiently interacting with fast-flowing data. Through relevant examples and illustrated use cases, you’ll explore designs for applications that read, analyze, share, and store streaming data. Along the way, you’ll discover the roles of key technologies like Spark, Storm, Kafka, Flink, RabbitMQ, and more. This course offers the perfect balance between big-picture thinking and implementation details.
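
For a concrete flavor of what "efficiently interacting with fast-flowing data" looks like, here is a minimal, framework-free sketch of a read → analyze → store loop. The simulated click events, field names, and in-memory "store" are assumptions made for illustration only, not course material.

```python
import random
import time
from collections import Counter

def event_source(n=20):
    """Simulate a fast-flowing stream of click events (assumed schema)."""
    pages = ["/home", "/pricing", "/docs"]
    for _ in range(n):
        yield {"page": random.choice(pages), "ts": time.time()}

def analyze(events):
    """Analysis step: maintain a running count of views per page."""
    counts = Counter()
    for event in events:
        counts[event["page"]] += 1
        yield dict(counts)  # emit an updated summary per event

# "Storage" stand-in: keep only the latest summary in memory.
latest_summary = None
for summary in analyze(event_source()):
    latest_summary = summary

print(latest_summary)  # e.g. {'/home': 7, '/docs': 6, '/pricing': 7}
```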

Skills acquired & topics covered

  • Introduction to the concepts and requirements of streaming and real-time data systems
  • How to efficiently interact with fast-flowing data
  • The right way to collect real-time data
  • Architecting a streaming pipeline
  • Analyzing the data
  • Which technologies to use and when

Course breakdown / modules

  • What is a real-time system?
  • Differences between real-time and streaming systems
  • The architectural blueprint
  • Security for streaming systems
  • How do we scale?

  • Common interaction patterns
  • Scaling the interaction patterns
  • Fault tolerance
  • A dose of reality

  • Why we need a message queuing tier
  • Core concepts
  • Security
  • Fault tolerance
  • Applying the core concepts to business problems
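
To make the message-queuing ideas in this module concrete, here is a hedged sketch of a producer and consumer decoupled by a queue. It assumes a Kafka broker on localhost:9092 and a topic named orders, and uses the kafka-python client; the course itself is not tied to this particular broker or library.

```python
import json
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

# Producer side: the collection tier writes events and moves on.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 42, "total": 19.99})
producer.flush()

# Consumer side: the analysis tier reads at its own pace.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # {'order_id': 42, 'total': 19.99}
    break  # stop after one message for this sketch
```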

  • Understanding in-flight data analysis
  • Distributed stream-processing architecture
  • Key features of stream-processing frameworks

  • Accepting constraints and relaxing
  • Thinking about time
  • Summarization techniques
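
As one illustration of the time and summarization topics in this module, here is a pure-Python tumbling-window count keyed by event time; the window size and event shape are assumptions for the sketch, not prescriptions from the course.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # tumbling window size (assumption for this sketch)

def window_start(event_time: float) -> float:
    """Align an event-time timestamp to the start of its tumbling window."""
    return event_time - (event_time % WINDOW_SECONDS)

def windowed_counts(events):
    """Count events per (window, page); events are dicts with 'ts' and 'page'."""
    counts = defaultdict(int)
    for event in events:
        counts[(window_start(event["ts"]), event["page"])] += 1
    return dict(counts)

events = [
    {"ts": 0.0, "page": "/home"},
    {"ts": 42.0, "page": "/home"},
    {"ts": 61.0, "page": "/docs"},
]
print(windowed_counts(events))
# {(0.0, '/home'): 2, (60.0, '/docs'): 1}
```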

  • When you need long-term storage
  • Keeping it in-memory
  • Use case exercises

  • Communications patterns
  • Protocols to use to send data to the client
  • Filtering the stream
  • Use case: building a Meetup RSVP streaming API
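
One protocol often used for pushing a filtered stream to clients is Server-Sent Events; here is a hedged sketch using Flask with simulated RSVP events. The endpoint path and event fields are assumptions for illustration and do not reflect the actual course exercise.

```python
import json
import time
from flask import Flask, Response  # pip install flask

app = Flask(__name__)

def rsvp_events():
    """Yield SSE-formatted messages; a stand-in for a real Meetup RSVP feed."""
    for n in range(5):
        event = {"event": "PyData Meetup", "rsvp_id": n, "response": "yes"}
        yield f"data: {json.dumps(event)}\n\n"
        time.sleep(1)

@app.route("/rsvps/stream")
def stream():
    # text/event-stream keeps the HTTP connection open and pushes each event.
    return Response(rsvp_events(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(port=5000)
```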

  • The core concepts
  • Making it real: SuperMediaMarkets
  • Introducing the web client
  • The move toward a query language

  • The collection tier
  • Message queuing tier
  • Analysis tier
  • In-memory data store
  • Data access tier
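
As a closing illustration of how the tiers in this final module fit together, here is a toy single-process wiring in which a Python queue stands in for the message-queuing tier and a Counter stands in for the in-memory store; every name and value is an assumption for the sketch, not part of the course.

```python
import queue
import threading
from collections import Counter

message_queue = queue.Queue()   # message queuing tier (stand-in)
in_memory_store = Counter()     # in-memory data store (stand-in)
STOP = object()                 # sentinel to end the stream

def collection_tier():
    """Collect raw events and hand them to the queuing tier."""
    for page in ["/home", "/docs", "/home", "/pricing", "/home"]:
        message_queue.put({"page": page})
    message_queue.put(STOP)

def analysis_tier():
    """Consume from the queue, aggregate, and update the in-memory store."""
    while True:
        event = message_queue.get()
        if event is STOP:
            break
        in_memory_store[event["page"]] += 1

def data_access_tier():
    """Serve the current aggregates to clients (here: just print them)."""
    print(dict(in_memory_store))  # {'/home': 3, '/docs': 1, '/pricing': 1}

producer = threading.Thread(target=collection_tier)
consumer = threading.Thread(target=analysis_tier)
producer.start(); consumer.start()
producer.join(); consumer.join()
data_access_tier()
```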