
A Cloud Guru | Azure Lab

Creating a Data Pipeline in Azure Data Factory

In this lab, you will learn how to create an Azure Data Factory instance and build and run a multi-step pipeline. You will also learn how to monitor your pipeline and chain activities in a logical progression, including failure and success dependency conditions in the pipeline build.


Path Info

Level: Intermediate
Duration: 1h 0m
Published: Sep 20, 2022


Table of Contents

  1. Challenge

    Prepare the Environment

    Create a Data Factory Instance

    • Use the given Resource Group and Subscription available in the dropdown menu.
    • Instance name should be unique.
    • Region should be East US and Version should be V2.
    • Git is not required.
    • Public endpoint is required.

    Create a SQL Database

    • Use the given Resource Group and Subscription available in the dropdown menu.
    • Database name should be unique.
    • Create a new server in East US region.
• Service tier should be Basic (DTU-based) with a 0.5 GB data max size.
    • Locally-redundant backup storage.
    • Make sure to enable Sample data.

    Create a Blob Storage

    • Use the given Resource Group and Subscription available in the dropdown menu.
    • Storage account name should be unique.
    • Region should be East US region.
    • Performance should be standard.
• Redundancy should be locally-redundant storage (LRS).
    • Network access should enable public access from all networks.
• Upload the AdventureWorks Sales-2.xlsx file from GitHub into a new blob container named delete.
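The three resources above can also be described declaratively. The sketch below captures the lab's required settings as ARM-template-style specs expressed as Python dicts; this is a non-authoritative sketch, and the resource names, server name, and API versions are placeholder assumptions you would replace with your own unique values.

```python
# Hypothetical ARM-style specs for the lab's three resources.
# Names marked "unique123" are placeholders; yours must be globally unique.
resources = [
    {
        "type": "Microsoft.DataFactory/factories",
        "apiVersion": "2018-06-01",            # Data Factory V2
        "name": "adf-lab-unique123",
        "location": "eastus",                  # East US
        "properties": {"publicNetworkAccess": "Enabled"},  # public endpoint
    },
    {
        "type": "Microsoft.Sql/servers/databases",
        "apiVersion": "2021-11-01",
        "name": "sql-lab-server/sqldb-lab-unique123",      # new server / database
        "location": "eastus",
        "sku": {"name": "Basic", "tier": "Basic"},         # Basic DTU-based tier
        "properties": {
            "maxSizeBytes": 524288000,                     # 0.5 GB data max size
            "requestedBackupStorageRedundancy": "Local",   # locally-redundant backups
            "sampleName": "AdventureWorksLT",              # enables sample data
        },
    },
    {
        "type": "Microsoft.Storage/storageAccounts",
        "apiVersion": "2022-09-01",
        "name": "stlabunique123",
        "location": "eastus",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},                   # standard, locally-redundant
        "properties": {"publicNetworkAccess": "Enabled"},  # access from all networks
    },
]

# Quick local sanity check that every resource lands in East US.
assert all(r["location"] == "eastus" for r in resources)
```

In the lab itself you create these through the portal dropdowns; the dict form is just a compact way to double-check each required setting.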
  2. Challenge

    Create a Copy Data Activity

    • Create a Copy Data activity and name it SQL DB.
• The source is the SalesLT.Product table in SQL DB.
    • The sink is a new table named SalesLT.Product2 in SQL DB.
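Under the hood, the Copy Data activity you configure above becomes part of the pipeline's JSON definition. Below is a minimal sketch of that JSON as a Python dict; the dataset reference names (ProductSource, Product2Sink) are assumptions, since in the lab they are generated when you pick the source and sink in the UI.

```python
# Sketch of the Copy Data activity as ADF pipeline JSON (dataset names assumed).
copy_activity = {
    "name": "SQL DB",
    "type": "Copy",
    "inputs": [{"referenceName": "ProductSource", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "Product2Sink", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "AzureSqlSource"},
        # "autoCreate" lets the sink create SalesLT.Product2 if it doesn't exist.
        "sink": {"type": "AzureSqlSink", "tableOption": "autoCreate"},
    },
}

assert copy_activity["typeProperties"]["sink"]["tableOption"] == "autoCreate"
```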
  3. Challenge

    Delete and the First Web Activity

• Create a Web activity that triggers whether the Copy Data activity succeeds or fails.
    • It should trigger QuackersCrazySales.Ducks?sales=NA2.
    • Create a Delete activity to delete all files in the new container in our Blob Storage account.
    • This will only activate if the data was successfully copied in Objective 2.
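The difference between the two activities above comes down to dependency conditions: Completed fires whether the upstream activity succeeds or fails, while Succeeded fires only on success. A sketch of both activities as ADF JSON dicts (the activity names Web1 and Delete1 and the dataset name DeleteContainerDataset are assumptions, not the lab's required names):

```python
# Web activity: runs whether the Copy Data activity succeeds or fails.
web_activity = {
    "name": "Web1",
    "type": "WebActivity",
    "dependsOn": [{"activity": "SQL DB", "dependencyConditions": ["Completed"]}],
    "typeProperties": {
        "url": "QuackersCrazySales.Ducks?sales=NA2",
        "method": "GET",
    },
}

# Delete activity: runs ONLY if the Copy Data activity succeeded.
delete_activity = {
    "name": "Delete1",
    "type": "Delete",
    "dependsOn": [{"activity": "SQL DB", "dependencyConditions": ["Succeeded"]}],
    "typeProperties": {
        "dataset": {"referenceName": "DeleteContainerDataset", "type": "DatasetReference"},
        "enableLogging": False,
    },
}
```

Because the Web activity uses Completed rather than Succeeded, it will still fire if the copy fails, whereas the Delete activity is skipped in that case.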
  4. Challenge

    Failure and a Final Web Trigger

• Create a Fail activity that activates if the Delete activity fails.
• Set the Error Message to The Delete Activity Failed to Run. and the Error Code to 1Quack2.
• If the Delete activity fails, a Web activity should also trigger QuackersCrazySales.Ducks?delete_error=NA2.
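Both of these hang off the Delete activity with a Failed dependency condition. A sketch as ADF JSON dicts, assuming the upstream Delete activity is named Delete1 and using placeholder names Fail1 and Web2:

```python
# Fail activity: surfaces a custom error if the Delete activity fails.
fail_activity = {
    "name": "Fail1",
    "type": "Fail",
    "dependsOn": [{"activity": "Delete1", "dependencyConditions": ["Failed"]}],
    "typeProperties": {
        "message": "The Delete Activity Failed to Run.",
        "errorCode": "1Quack2",
    },
}

# Final Web activity: also fires only on the Delete activity's failure.
final_web = {
    "name": "Web2",
    "type": "WebActivity",
    "dependsOn": [{"activity": "Delete1", "dependencyConditions": ["Failed"]}],
    "typeProperties": {
        "url": "QuackersCrazySales.Ducks?delete_error=NA2",
        "method": "GET",
    },
}
```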
  5. Challenge

    Validate and Debug

    In this final step, the newly created pipeline should be validated and debugged.
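ADF Studio's Validate button performs the authoritative check, but the dependency graph built across the previous challenges can also be sanity-checked locally. The sketch below assembles the whole pipeline (with the same assumed activity names used earlier) and verifies that every dependency points at a real activity:

```python
# Assumed pipeline assembled from the previous challenges; names other than
# "SQL DB" are placeholders. ADF's own Validate is the authoritative check.
pipeline = {
    "name": "LabPipeline",
    "properties": {
        "activities": [
            {"name": "SQL DB", "type": "Copy", "dependsOn": []},
            {"name": "Web1", "type": "WebActivity",
             "dependsOn": [{"activity": "SQL DB", "dependencyConditions": ["Completed"]}]},
            {"name": "Delete1", "type": "Delete",
             "dependsOn": [{"activity": "SQL DB", "dependencyConditions": ["Succeeded"]}]},
            {"name": "Fail1", "type": "Fail",
             "dependsOn": [{"activity": "Delete1", "dependencyConditions": ["Failed"]}]},
            {"name": "Web2", "type": "WebActivity",
             "dependsOn": [{"activity": "Delete1", "dependencyConditions": ["Failed"]}]},
        ]
    },
}

# Every dependsOn entry must reference an activity that exists in the pipeline.
names = {a["name"] for a in pipeline["properties"]["activities"]}
for act in pipeline["properties"]["activities"]:
    for dep in act["dependsOn"]:
        assert dep["activity"] in names, f"unknown dependency {dep['activity']}"
```

After validation passes, running the pipeline in Debug mode lets you watch each activity's status in the output pane before publishing.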

The Cloud Content team comprises subject matter experts hyper-focused on services offered by the leading cloud vendors (AWS, GCP, and Azure), as well as cloud-related technologies such as Linux and DevOps. The team is thrilled to share their knowledge to help you build modern tech solutions from the ground up, secure and optimize your environments, and so much more!
