Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service
In this course, you will learn how to use the Amazon Kinesis Data Analytics service to process streaming data with both the Apache Flink runtime and the SQL runtime. You will integrate your streaming applications with Kinesis Data Streams, Kinesis Data Firehose delivery streams, and Amazon S3.
What you'll learn
Kinesis Data Analytics is a serverless service for transforming and analyzing streaming data in real time with Apache Flink and SQL. In this course, Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service, you will learn that Kinesis Data Analytics is part of the Kinesis streaming platform along with Kinesis Data Streams, Kinesis Data Firehose, and Kinesis Video Streams.
First, you will be introduced to the Kinesis Data Analytics service for processing and analyzing streams. You will explore the available runtimes for processing your data: the Apache Flink runtime, the SQL runtime, and the Apache Beam runtime. You will then deploy a streaming application using the AWS command-line interface, which involves setting up the correct roles and policies for your application to access the resources it needs.
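To give a taste of the application code you will deploy, here is a minimal sketch of a Flink streaming job that reads string records from a Kinesis data stream. The stream name, region, and transformation are hypothetical placeholders, not the exact code used in the course demos:

```java
import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

public class StreamingJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties consumerConfig = new Properties();
        consumerConfig.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1");   // hypothetical region
        consumerConfig.setProperty(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST");

        // Read records from the Kinesis data stream as plain strings.
        DataStream<String> input = env.addSource(new FlinkKinesisConsumer<>(
                "input-stream",               // hypothetical stream name
                new SimpleStringSchema(),
                consumerConfig));

        // Placeholder transformation standing in for the course's processing logic.
        input.map(String::toUpperCase).print();

        env.execute("kinesis-analytics-sketch");
    }
}
```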
Next, you will learn how to deploy a Kinesis Data Analytics application using the web console. You will configure your streaming application to read from an enhanced fan-out (EFO) consumer and write to Kinesis Firehose delivery streams. You will also explore using the Table API in Apache Flink to process streaming data.
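Enhanced fan-out gives each registered consumer its own dedicated read throughput rather than the shared polling throughput. A minimal sketch of switching the Flink Kinesis source to EFO follows; the region, consumer name, and stream name are hypothetical:

```java
import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

public class EfoConsumerConfig {
    // Builds a Kinesis source that reads through a registered EFO consumer
    // instead of the default polling record publisher.
    public static FlinkKinesisConsumer<String> efoSource() {
        Properties props = new Properties();
        props.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1");            // hypothetical region
        props.setProperty(ConsumerConfigConstants.RECORD_PUBLISHER_TYPE, "EFO");
        props.setProperty(ConsumerConfigConstants.EFO_CONSUMER_NAME, "my-efo-consumer"); // hypothetical name
        return new FlinkKinesisConsumer<>("input-stream", new SimpleStringSchema(), props);
    }
}
```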
Finally, you will deploy and run Kinesis Data Analytics applications using the SQL runtime, which allows you to run interactive SQL queries to process input streams. You will learn how to create and use in-application streams and understand the purpose of the stream pump.
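For concreteness, here is a short sketch in the Kinesis Data Analytics SQL dialect. A pump is the continuous query that moves rows from one in-application stream into another; SOURCE_SQL_STREAM_001 is the default name the service gives the in-application input stream, while the column names here are hypothetical:

```sql
-- Declare an in-application output stream to hold the results.
CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
    ticker_symbol VARCHAR(4),
    avg_price     DOUBLE
);

-- The pump continuously inserts query results into the output stream,
-- here averaging prices over one-minute tumbling windows.
CREATE OR REPLACE PUMP "STREAM_PUMP" AS
  INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT STREAM ticker_symbol, AVG(price) AS avg_price
    FROM "SOURCE_SQL_STREAM_001"
    GROUP BY ticker_symbol,
             STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '60' SECOND);
```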
When you are finished with this course, you will have the skills and knowledge to create and deploy streaming applications on Kinesis Data Analytics and to use connectors to work with other AWS services as data sources and data sinks.
Table of contents
- Version Check 0m
- Prerequisites and Course Outline 2m
- Introducing Kinesis Data Analytics 6m
- The Apache Flink Runtime 2m
- Demo: Environment Setup on Local Machine 2m
- Demo: Creating Access ID and Secret Key for CLI 6m
- Kinesis Sources and Sinks 4m
- Demo: Creating a Kinesis Data Stream and S3 Bucket 5m
- Demo: Setting up an Apache Maven Project 3m
- Demo: Understanding the Stream Processing Code 5m
- Demo: Creating Policies and Roles for the Streaming Application 5m
- Demo: Creating and Running an App Using the CLI 3m
- Demo: Publishing Data to Kinesis Data Streams 2m
- Demo: Viewing Results and Stopping Application 3m
- Demo: Updating Application Using the Command Line 4m
- Demo: Running Application and Viewing Results 3m
- Kinesis Data Analytics Pricing 2m
- Demo: Creating an Application Using the Web Console 7m
- Demo: Starting and Stopping Applications Using the Web Console 5m
- Demo: Configure Your Application to Read from an EFO Consumer 7m
- Demo: Running an Application Reading from an EFO Consumer 2m
- Demo: Writing Processed Records to a Kinesis Data Stream 7m
- Demo: Sink Records to S3 Using a Kinesis Firehose Delivery Stream 7m
- Demo: Creating a Kinesis Firehose Delivery Stream for Direct PUT Operations 3m
- Demo: Writing Streaming Results to a Kinesis Firehose Delivery Stream 5m
- Demo: Transforming Delivery Stream Records Using an AWS Lambda Function 5m
- Demo: Stream Processing Using the Table API 9m
- The SQL Runtime 6m
- Demo: Publish Records in the JSON Format to the Kinesis Data Stream 4m
- Demo: Running SQL Queries to Process Streaming Data 8m
- Demo: Connecting a Destination to SQL Stream Processing 7m
- Demo: Editing Schema and Performing Windowing Operations 7m
- Demo: Working with CSV Data 5m
- Summary and Further Study 2m