The Kafka Internals training course teaches students how Apache Kafka's publish/subscribe messaging system works, including its many advanced configuration options. Apache Kafka is an open-source stream-processing platform that provides a unified, high-throughput, low-latency system for handling real-time data feeds from a wide range of source systems.
The course begins by covering core configurations, introducing students to brokers, consumers, producers, and topics. Next, students build their own Kafka cluster on Linux Academy servers. The course concludes by applying that knowledge to real-world scenarios, such as processing real-time stock price updates from an API and consolidating them into a data lake.
Purpose
| Learn to use Apache Kafka as a distributed messaging system. |
Audience
| Developers and developer teams who know Java and basic Linux commands and want to leverage Kafka's architecture. |
Role
| Software Developer |
Skill Level
| Intermediate |
Style
| Workshops |
Duration
| 2 Days |
Related Technologies
| Java | Linux |
Productivity Objectives
- Understand the Kafka architecture and describe the roles and responsibilities of its daemons
- Use producers, consumers, and brokers within Kafka
- Construct a streaming ETL pipeline using Kafka Connect
- Explain how and when to use Kafka developer APIs
- Perform real-time analytics using KSQL
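As a taste of the hands-on material, the sketch below shows the kind of producer configuration students will work with when using producers, consumers, and brokers. It is a minimal illustration, not course code: the broker address `localhost:9092` and the class name `ProducerConfigSketch` are assumptions, and the property keys are standard Kafka producer settings.

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Builds the configuration a Kafka producer would be constructed with.
    // The broker address below is a placeholder for a local single-node cluster.
    static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // broker(s) to bootstrap from
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all"); // wait for the full in-sync replica set to acknowledge
        return props;
    }

    public static void main(String[] args) {
        // Print one setting to show the configuration was assembled.
        System.out.println(producerProps().getProperty("acks"));
    }
}
```

In the course itself, a `Properties` object like this is passed to a `KafkaProducer` constructor; `acks=all` is the durability-first choice, trading some latency for the guarantee that every in-sync replica has the record.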