Integrating Data in Microsoft Azure
With over 2.5 quintillion bytes of data generated each day, you need to rely on modern tools to efficiently use and analyze data. In this course, you will learn how to leverage the power of Azure to create data integration pipelines in the cloud.
What you'll learn
Data-driven decision making is the path to business success. In this course, Integrating Data in Microsoft Azure, you will gain foundational knowledge to integrate data utilizing the power of Microsoft Azure.
First, you will learn how to migrate data from on-premises environments and Amazon Web Services to Azure. Next, you will discover how to easily construct ETL processes and create data integration pipelines using Azure Data Factory.
Finally, you will explore how to create a real-time pipeline that ingests and processes events sent by IoT devices, using Azure Event Hubs, Azure Stream Analytics, and Power BI.
When you’re finished with this course, you will have the skills and knowledge needed to create data integration pipelines using some of the great tools that are part of the Azure ecosystem.
Table of contents
- Introduction 1m
- Creating an Azure SQL Server and Database 3m
- Using the Data Migration Assistant to Detect Compatibility Issues and Migrate Data 6m
- Introducing Azure Data Factory 2m
- Understanding Pipelines, Activities, Datasets, and Linked Services 4m
- Getting to Know Integration Runtimes 1m
- Identifying On-premises and Azure SQL Database Assets 1m
- Getting Familiar with the Azure Data Factory UI 4m
- Creating Your First Azure Data Factory Pipeline 2m
- Creating Your First Data Pipeline Activity 5m
- Querying an On-premises Table by Using a Self-hosted Integration Runtime 5m
- Copying Data Incrementally from On-premises to Azure SQL Database 5m
- Executing a Parameterized Stored Procedure in an Azure Data Factory Pipeline 3m
- Running and Monitoring a Pipeline Execution 4m
- Summary 1m
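The incremental copy covered in this module typically follows the watermark pattern: remember the highest change-tracking value copied so far, and on each run copy only rows newer than it. A minimal local sketch of that idea, simulated with `sqlite3` (the table and column names here are illustrative assumptions, not the course's actual schema):

```python
import sqlite3

# Source table with a "last modified" column that acts as the watermark key.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, modified TEXT)")
src.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-05"), (3, "2024-01-09")],
)

watermark = "2024-01-03"  # highest value copied in the previous run

# Copy only rows changed since the watermark, then advance it.
rows = src.execute(
    "SELECT id, modified FROM orders WHERE modified > ? ORDER BY modified",
    (watermark,),
).fetchall()
new_watermark = rows[-1][1] if rows else watermark

print(rows)           # rows to load into the sink
print(new_watermark)  # persisted for the next run
```

In Azure Data Factory the same steps map to a Lookup activity (read the stored watermark), a Copy activity with a filtered source query, and a stored procedure or similar step to persist the new watermark.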
- Introduction 1m
- Identifying Azure Blob Storage Assets 1m
- Getting the Amazon S3 Access Key 2m
- Creating the Pipeline and Copy Activity 1m
- Creating the Source Dataset and Linked Service 3m
- Creating the Sink Dataset and Linked Service 2m
- Validating, Debugging, Publishing, Triggering, and Monitoring Pipeline Execution 4m
- Summary 1m
- Introduction 1m
- Creating the Pipeline and Pipeline Parameters 1m
- Adding a Get Metadata Activity to the Pipeline 3m
- Adding a Filter Activity to the Pipeline 2m
- Creating a Reusable Dataset 2m
- Adding a ForEach Activity to the Pipeline 5m
- Introducing Azure Databricks 3m
- Transforming Data with a Databricks Notebook Activity 5m
- Sending Transactional Email with Azure Logic Apps 4m
- Deleting Staging Files with a Delete Activity 2m
- Executing and Monitoring the Pipeline 2m
- Creating a Trigger That Runs a Pipeline on a Schedule 1m
- Introducing Data Flows 2m
- Learning About Other Ways to Create Azure Data Factory Resources 2m
- Learning About Other Types of Integration That ADF Makes Possible 2m
- Summary 1m
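The core flow of this module is: enumerate the files in a folder (Get Metadata), keep only the ones you care about (Filter), then iterate over them (ForEach). A minimal local sketch of that pattern in plain Python, with illustrative file names standing in for the staging files used in the course:

```python
from pathlib import Path
import tempfile

# Stage some demo files in a temporary folder.
root = Path(tempfile.mkdtemp())
for name in ["sales.csv", "notes.txt", "regions.csv"]:
    (root / name).write_text("demo")

# Get Metadata activity: list the folder's child items.
children = [p.name for p in root.iterdir()]

# Filter activity: keep only the CSV files.
csv_files = [n for n in children if n.endswith(".csv")]

# ForEach activity: run a stand-in transformation on each file.
processed = []
for name in sorted(csv_files):
    processed.append(name.upper())

print(processed)
```

In Azure Data Factory each of these steps is a separate activity wired together in the pipeline canvas, with the Get Metadata output feeding the Filter, and the Filter output feeding the ForEach.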
- Introduction 1m
- Creating and Configuring Event Hubs 2m
- Introducing the Air Pollution Sensor Generator 2m
- Creating a Stream Analytics Job 2m
- Creating a Stream Analytics Job Input 3m
- Creating a Stream Analytics Job Reference Input 1m
- Creating a Stream Analytics Job Output 2m
- Creating a Stream Analytics Job Query 3m
- Starting a Stream Analytics Job and Reviewing Monitoring Tools 2m
- Summary 1m