
Creating a Data Pipeline Using Azure Synapse Pipelines

Data pipelines can be created in Azure Data Factory as well as in Azure Synapse Pipelines. Here we'll build a two-step pipeline using Azure Synapse Pipelines: first we copy data from an Azure SQL database to a data lake, and then from the data lake to a dedicated SQL pool table using PolyBase.
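
To orient yourself before starting, the finished pipeline reduces to two chained copy activities. Below is a minimal sketch of what the pipeline definition might look like in the Code (JSON) view, expressed as a Python dict; the activity and dataset names are placeholders, not values prescribed by the lab.

```python
# Hypothetical skeleton of the finished two-step pipeline (all names are placeholders).
pipeline = {
    "name": "CopyCustomerData",
    "properties": {
        "activities": [
            {
                "name": "CopySqlToLake",  # step 1: Azure SQL -> data lake (Parquet)
                "type": "Copy",
                "inputs": [{"referenceName": "AzureSqlCustomer", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "LakeCustomerParquet", "type": "DatasetReference"}],
            },
            {
                "name": "CopyLakeToPool",  # step 2: data lake -> dedicated SQL pool
                "type": "Copy",
                # runs only after step 1 succeeds (the "succeeded" dependency)
                "dependsOn": [{"activity": "CopySqlToLake", "dependencyConditions": ["Succeeded"]}],
                "inputs": [{"referenceName": "LakeCustomerParquet", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "PoolDimCustomer", "type": "DatasetReference"}],
            },
        ]
    },
}
```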


Path Info

Level: Intermediate
Duration: 1h 0m
Published: Nov 09, 2023


Table of Contents

  1. Challenge

    Set Up the Environment

    1. Create an Azure Synapse Analytics workspace, defining a new Azure Data Lake Storage Gen2 account as part of the workspace setup.
    2. Create a container named staging in the data lake account.
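
    If you'd rather script the container creation than click through the portal, a minimal sketch using the azure-identity and azure-storage-file-datalake packages follows; the account URL is a placeholder for whatever storage account name you chose.

```python
# Sketch: create the "staging" container (a "file system" in ADLS Gen2 terms).
# The account name is a placeholder; authentication assumes DefaultAzureCredential
# can resolve credentials in your environment.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<your-datalake-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
service.create_file_system("staging")
```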
  2. Challenge

    Set Up Dedicated SQL Pool Instance

    Within the Synapse workspace, create a dedicated SQL pool named TaxiRidesWarehouse with a performance level of DW100c.
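
    As with the container, the pool can also be provisioned programmatically. A hedged sketch with the azure-mgmt-synapse management SDK is below; the subscription ID, resource group, workspace name, and region values are placeholders.

```python
# Sketch: provision the dedicated SQL pool at performance level DW100c.
# Subscription ID, resource group, workspace name, and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.synapse import SynapseManagementClient
from azure.mgmt.synapse.models import Sku, SqlPool

client = SynapseManagementClient(DefaultAzureCredential(), "<subscription-id>")
poller = client.sql_pools.begin_create(
    "<resource-group>",
    "<synapse-workspace>",
    "TaxiRidesWarehouse",
    SqlPool(location="<region>", sku=Sku(name="DW100c")),
)
print(poller.result().status)  # e.g., "Online" once provisioning completes
```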

  3. Challenge

    Create a Pipeline to Copy Data from Azure SQL Table to Data Lake File

    1. Create a linked service for the Azure SQL database. The credentials to connect to Azure SQL are available in the Additional Information and Resources section.
    2. Create a linked service for the data lake.
    3. Create an integration dataset for the Azure SQL table SalesLT.Customer.
    4. Create an integration dataset for the data lake file. Use Parquet as the format, keep the file in the staging container, and set the import schema to None.
    5. Create a pipeline with a copy activity. Set the source to the Azure SQL table dataset and the sink to the data lake file dataset.
    6. Update the mappings in the copy activity (see the sketch after this list):
      1. Rename the CustomerID column to ID.
      2. Keep only the following columns: Title, FirstName, MiddleName, LastName, Suffix, CompanyName, SalesPerson, EmailAddress, and Phone.
    7. Run the pipeline and verify that the file is created successfully in the data lake.
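
    In the copy activity's JSON (Code view), the mapping from step 6 corresponds to a TabularTranslator block. A sketch is below, expressed as a Python dict; only the first few column mappings are written out, and the remaining columns follow the same pattern.

```python
# Sketch of the copy activity's translator: CustomerID is renamed to ID, and
# only explicitly mapped columns are copied to the sink.
translator = {
    "type": "TabularTranslator",
    "mappings": [
        {"source": {"name": "CustomerID"}, "sink": {"name": "ID"}},  # renamed
        {"source": {"name": "Title"}, "sink": {"name": "Title"}},
        {"source": {"name": "FirstName"}, "sink": {"name": "FirstName"}},
        # ...and likewise for MiddleName, LastName, Suffix, CompanyName,
        # SalesPerson, EmailAddress, and Phone
    ],
}
```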
  4. Challenge

    Update Pipeline to Copy Data from Data Lake File to Dedicated SQL Pool Table

    1. Create a DimCustomer table in the dedicated SQL pool. The script to create it is available in the Additional Information and Resources section.
    2. Create an integration dataset for the dedicated SQL pool table, DimCustomer. Use Azure Synapse dedicated SQL pool as the data store.
    3. In the existing pipeline, add another copy activity. Set the source to the data lake file dataset and the sink to the dedicated SQL pool table dataset.
    4. In the sink of the second copy activity, set the copy method to PolyBase (see the sketch after this list).
    5. Connect the two copy activities using the Succeeded dependency, so that data is first copied from Azure SQL to the data lake and then from the data lake to the dedicated SQL pool.
    6. Run the pipeline and verify that the data is successfully loaded into the DimCustomer table.
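
    Choosing PolyBase as the copy method corresponds to enabling allowPolyBase on the sink in the activity's JSON (a SqlDWSink for a dedicated SQL pool); the Succeeded dependency wiring appears in the pipeline skeleton near the top of this page. The polyBaseSettings values below are illustrative defaults, not lab requirements.

```python
# Sketch of the second copy activity's sink with PolyBase enabled.
sink = {
    "type": "SqlDWSink",    # dedicated SQL pool sink type
    "allowPolyBase": True,  # "Copy method: PolyBase" in the UI
    "polyBaseSettings": {   # illustrative values, not prescribed by the lab
        "rejectType": "value",
        "rejectValue": 0,
        "useTypeDefault": True,
    },
}
```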

The Cloud Content team comprises subject matter experts hyper-focused on services offered by the leading cloud vendors (AWS, GCP, and Azure), as well as cloud-related technologies such as Linux and DevOps. The team is thrilled to share their knowledge to help you build modern tech solutions from the ground up, secure and optimize your environments, and so much more!

What's a lab?

Hands-on Labs are real environments created by industry experts to help you learn. They let you gain knowledge and experience, practice without compromising your system, test without risk, destroy without fear, and learn from your mistakes. Hands-on Labs: practice your skills before delivering in the real world.

Provided environment for hands-on practice

We will provide the credentials and environment necessary for you to practice right within your browser.

Guided walkthrough

Follow along with the author’s guided walkthrough and build something new in your provided environment!

Did you know?

On average, you retain 75% more of your learning if you get time for practice.
