Create Complex DAGs and Task Dependencies with Apache Airflow
Learn to build complex data pipelines with Apache Airflow. This course will teach you to create DAGs that read, validate, aggregate, and load data, manage task dependencies, and use XComs to pass data between tasks.
What you'll learn
Apache Airflow excels at making complex data pipelines easy to manage.
In this course, Create Complex DAGs and Task Dependencies with Apache Airflow, you’ll gain the ability to design and implement intricate workflows in Apache Airflow.
First, you’ll explore how to create a DAG that reads a CSV file from a local directory, using a BashOperator to check that the file exists.
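As a rough idea of what that first step looks like, here is a minimal sketch of a DAG whose BashOperator fails the task when the expected file is missing. The DAG id, file path, and Airflow 2.x-style arguments are illustrative assumptions, not the course's exact values.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="csv_pipeline_sketch",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # `test -f` exits non-zero when the file is absent, marking the task as failed.
    check_file_exists = BashOperator(
        task_id="check_file_exists",
        bash_command="test -f /tmp/data/input.csv",  # hypothetical path
    )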
Next, you’ll discover how to perform data validation and aggregation using PythonOperators.
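The validation and aggregation steps might look something like the sketch below, with each step wrapped in a PythonOperator. The column names, file paths, and validation rules are placeholders, and the tasks are assumed to live inside a DAG context like the one shown above.

import pandas as pd
from airflow.operators.python import PythonOperator

def validate_csv(path: str) -> None:
    """Fail the task if the file is empty or required columns are missing."""
    df = pd.read_csv(path)
    required = {"order_id", "amount"}            # hypothetical columns
    missing = required - set(df.columns)
    if missing or df.empty:
        raise ValueError(f"Validation failed: missing columns {missing} or empty file")

def aggregate_csv(path: str, out_path: str) -> None:
    """Aggregate the validated data and write the result for the load step."""
    df = pd.read_csv(path)
    summary = df.groupby("order_id", as_index=False)["amount"].sum()
    summary.to_csv(out_path, index=False)

validate_data = PythonOperator(
    task_id="validate_data",
    python_callable=validate_csv,
    op_kwargs={"path": "/tmp/data/input.csv"},
)

aggregate_data = PythonOperator(
    task_id="aggregate_data",
    python_callable=aggregate_csv,
    op_kwargs={"path": "/tmp/data/input.csv", "out_path": "/tmp/data/summary.csv"},
)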
Finally, you’ll learn how to load the transformed data using a SQLiteOperator, set up task dependencies with the bitshift operator and the set_upstream()/set_downstream() methods, and control the flow of execution by passing data between tasks using XComs.
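Putting those pieces together, the following sketch shows a load task, both ways of wiring dependencies, and an XCom push/pull pair. It assumes the SQLite provider package (apache-airflow-providers-sqlite) is installed and a connection id "sqlite_default" exists; the table, SQL, and values are placeholders rather than the course's data.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.sqlite.operators.sqlite import SqliteOperator

def push_row_count(**context):
    # Push a value to XCom so a downstream task can read it.
    context["ti"].xcom_push(key="row_count", value=42)   # placeholder value

def pull_row_count(**context):
    # Pull the value pushed by the upstream task.
    count = context["ti"].xcom_pull(task_ids="compute_count", key="row_count")
    print(f"Rows aggregated upstream: {count}")

with DAG(
    dag_id="load_sketch",               # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    compute_count = PythonOperator(task_id="compute_count", python_callable=push_row_count)
    report_count = PythonOperator(task_id="report_count", python_callable=pull_row_count)

    load_summary = SqliteOperator(
        task_id="load_summary",
        sqlite_conn_id="sqlite_default",
        sql="INSERT INTO summary (order_id, amount) VALUES (1, 99.0);",  # placeholder SQL
    )

    # Two equivalent ways to declare the same dependency chain:
    compute_count >> load_summary             # bitshift operator
    report_count.set_upstream(load_summary)   # same as load_summary >> report_count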
When you’re finished with this course, you’ll have the skills and knowledge of Apache Airflow needed to create and manage complex DAGs and task dependencies efficiently.
Table of contents
- Introduction and Version Check 1m
- Demo: Running a DAG with a BashOperator 4m
- Demo: Including the PythonOperator in the DAG 3m
- Demo: Setting up a SQLite Database and Connection 2m
- Demo: Reading, Processing, and Storing Data Using a Complex DAG 6m
- Demo: Using the Bit Shift Operator to Specify Dependencies 1m
- Demo: Using XComs to Pass Data between Tasks 4m