Creating a Data Pipeline in Azure Data Factory

1 hour
  • 5 Learning Objectives

About this Hands-on Lab

In this lab, you will learn how to create an Azure Data Factory instance and build and run a multi-step pipeline. You will also learn how to monitor your pipeline and create activities that flow in a logical progression, building failure and success dependency conditions into the pipeline.

Learning Objectives

Successfully complete this lab by achieving the following learning objectives:

Prepare the Environment

Create a Data Factory Instance

  • Use the given Resource Group and Subscription available in the dropdown menu.
  • Instance name should be unique.
  • Region should be East US and Version should be V2.
  • Git is not required.
  • Public endpoint is required.
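
The lab has you create the factory in the Azure portal, but the same configuration can be scripted. Below is a minimal sketch using the Azure SDK for Python (azure-identity plus azure-mgmt-datafactory, which creates V2 factories); the subscription ID, resource group, and factory name are placeholders for the values your lab environment provides:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

# Placeholder values - substitute the subscription and resource group
# offered in the lab dropdown menus.
subscription_id = "<subscription-id>"
resource_group = "<lab-resource-group>"
factory_name = "adf-lab-12345"  # must be globally unique

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, subscription_id)

# The management SDK creates V2 factories; East US location, no Git
# integration, and the default (public) endpoint match the lab settings.
factory = adf_client.factories.create_or_update(
    resource_group, factory_name, Factory(location="eastus")
)
print(factory.provisioning_state)
```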

Create a SQL Database

  • Use the given Resource Group and Subscription available in the dropdown menu.
  • Database name should be unique.
  • Create a new server in the East US region.
  • Service tier should be Basic (DTU-based) with a 0.5 GB data max size.
  • Locally-redundant backup storage.
  • Make sure to enable Sample data.
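
For reference, here is a hedged sketch of the same database configuration with azure-mgmt-sql; the server name, admin credentials, and database name are placeholders, and the dictionary parameters mirror the portal settings listed above:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

subscription_id = "<subscription-id>"
resource_group = "<lab-resource-group>"
server_name = "sql-lab-server-12345"  # must be globally unique
database_name = "sqldb-lab-12345"

credential = DefaultAzureCredential()
sql_client = SqlManagementClient(credential, subscription_id)

# New logical SQL server in East US.
sql_client.servers.begin_create_or_update(
    resource_group,
    server_name,
    {
        "location": "eastus",
        "administrator_login": "labadmin",
        "administrator_login_password": "<strong-password>",
    },
).result()

# Basic DTU-based tier, 0.5 GB data max size, locally-redundant backups,
# seeded with the AdventureWorksLT sample data.
sql_client.databases.begin_create_or_update(
    resource_group,
    server_name,
    database_name,
    {
        "location": "eastus",
        "sku": {"name": "Basic", "tier": "Basic"},
        "max_size_bytes": 524_288_000,
        "sample_name": "AdventureWorksLT",
        "requested_backup_storage_redundancy": "Local",
    },
).result()
```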

Create a Blob Storage

  • Use the given Resource Group and Subscription available in the dropdown menu.
  • Storage account name should be unique.
  • Region should be East US.
  • Performance should be Standard.
  • Redundancy should be locally-redundant storage (LRS).
  • Network access should enable public access from all networks.
  • Upload the AdventureWorks Sales-2.xlsx file from GitHub into a new blob container named delete.
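
If you prefer scripting this step, here is a rough equivalent using azure-mgmt-storage and azure-storage-blob; the account name is a placeholder, and uploading with DefaultAzureCredential assumes your identity has blob data permissions on the new account:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.storage.blob import BlobServiceClient

subscription_id = "<subscription-id>"
resource_group = "<lab-resource-group>"
account_name = "labblobstore12345"  # must be globally unique, lowercase

credential = DefaultAzureCredential()
storage_client = StorageManagementClient(credential, subscription_id)

# Standard performance, locally-redundant (LRS), public access allowed.
storage_client.storage_accounts.begin_create(
    resource_group,
    account_name,
    {
        "location": "eastus",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},
        "allow_blob_public_access": True,
    },
).result()

# Upload the workbook (downloaded locally from the lab's GitHub repo)
# into a new container named "delete".
blob_service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=credential,
)
container = blob_service.create_container("delete")
with open("AdventureWorks Sales-2.xlsx", "rb") as data:
    container.upload_blob(name="AdventureWorks Sales-2.xlsx", data=data)
```
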
Create a Copy Data Activity

  • Create a Copy Data activity and name it SQL DB.
  • The source is the SalesLT.Product table in SQL DB.
  • The sink is a new table named SalesLT.Product2 in SQL DB.
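
In the lab you build this activity in ADF Studio, but for illustration, here is a sketch of the same Copy activity using the azure-mgmt-datafactory models. The dataset names ProductSource and ProductSink are assumptions standing in for Azure SQL datasets you would create for SalesLT.Product and SalesLT.Product2, and adf_client, resource_group, and factory_name carry over from the Data Factory sketch above:

```python
from azure.mgmt.datafactory.models import (
    AzureSqlSink,
    AzureSqlSource,
    CopyActivity,
    DatasetReference,
    PipelineResource,
)

# "ProductSource" and "ProductSink" are assumed Azure SQL datasets pointing
# at SalesLT.Product and SalesLT.Product2 respectively.
copy_activity = CopyActivity(
    name="SQL DB",
    inputs=[DatasetReference(type="DatasetReference", reference_name="ProductSource")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="ProductSink")],
    source=AzureSqlSource(),
    # autoCreate lets the sink table SalesLT.Product2 be created if missing.
    sink=AzureSqlSink(table_option="autoCreate"),
)

# Publish a pipeline containing just the copy step for now.
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(
    resource_group, factory_name, "lab-pipeline", pipeline
)
```
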
Delete and the First Web Activity

  • Create a Web activity that triggers whether the Copy Data activity succeeds or fails.
  • It should trigger QuackersCrazySales.Ducks?sales=NA2.
  • Create a Delete activity to delete all files in the new container in the Blob Storage account.
  • This will only activate if the data was successfully copied in Objective 2.
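
A sketch of these two activities with the same SDK models; the activity names, the https:// scheme on the web URL, and the BlobDeleteDataset dataset (pointing at the delete container) are assumptions:

```python
from azure.mgmt.datafactory.models import (
    ActivityDependency,
    DatasetReference,
    DeleteActivity,
    WebActivity,
    WebActivityMethod,
)

# Fires on "Completed", i.e. whether the Copy Data activity succeeded or failed.
web_sales = WebActivity(
    name="Notify Sales",
    method=WebActivityMethod.GET,
    url="https://QuackersCrazySales.Ducks?sales=NA2",  # https:// scheme assumed
    depends_on=[
        ActivityDependency(activity="SQL DB", dependency_conditions=["Completed"])
    ],
)

# Runs only if the copy succeeded; "BlobDeleteDataset" is an assumed dataset
# pointing at the "delete" container in the storage account.
delete_blobs = DeleteActivity(
    name="Delete Files",
    dataset=DatasetReference(type="DatasetReference", reference_name="BlobDeleteDataset"),
    recursive=True,
    depends_on=[
        ActivityDependency(activity="SQL DB", dependency_conditions=["Succeeded"])
    ],
)
```
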
Failure and a Final Web Trigger

  • Create a Fail activity that activates if the Delete activity fails.
  • Enter the Error Message The Delete Activity Failed to Run. and set the Error Code to 1Quack2.
  • If the Delete activity fails, a Web activity should trigger QuackersCrazySales.Ducks?delete_error=NA2.
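
Continuing the sketch, the Fail activity and final Web activity might look like this; the activity names and URL scheme are again assumptions, and the last call re-publishes the pipeline with every activity defined so far:

```python
from azure.mgmt.datafactory.models import (
    ActivityDependency,
    FailActivity,
    PipelineResource,
    WebActivity,
    WebActivityMethod,
)

# Surfaces a custom message and error code when the Delete activity fails.
fail_on_delete = FailActivity(
    name="Delete Failed",
    message="The Delete Activity Failed to Run.",
    error_code="1Quack2",
    depends_on=[
        ActivityDependency(activity="Delete Files", dependency_conditions=["Failed"])
    ],
)

# Final web call, also gated on the Delete activity failing.
web_delete_error = WebActivity(
    name="Notify Delete Error",
    method=WebActivityMethod.GET,
    url="https://QuackersCrazySales.Ducks?delete_error=NA2",  # scheme assumed
    depends_on=[
        ActivityDependency(activity="Delete Files", dependency_conditions=["Failed"])
    ],
)

# Re-publish the pipeline with the full activity list.
pipeline = PipelineResource(
    activities=[copy_activity, web_sales, delete_blobs, fail_on_delete, web_delete_error]
)
adf_client.pipelines.create_or_update(
    resource_group, factory_name, "lab-pipeline", pipeline
)
```
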
Validate and Debug

In this final step, the newly created pipeline should be validated and debugged.
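
Validation is done with the Validate button in ADF Studio, and a Debug run also happens in the Studio. The closest scripted approximation is to trigger and poll a run of the published pipeline, sketched below using the adf_client, resource group, and factory name from the earlier snippets:

```python
import time

# Trigger a run of the published pipeline and poll until it reaches a
# terminal state - roughly what a Debug run reports in the Studio.
run = adf_client.pipelines.create_run(
    resource_group, factory_name, "lab-pipeline", parameters={}
)

while True:
    pipeline_run = adf_client.pipeline_runs.get(resource_group, factory_name, run.run_id)
    if pipeline_run.status not in ("Queued", "InProgress"):
        break
    time.sleep(15)

print(f"Pipeline run finished with status: {pipeline_run.status}")
```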

Additional Resources

In this hands-on lab scenario, you have been asked to step in and help a retail company that specializes in squeaker toys. They are attempting to construct a data pipeline that moves data around their environment and can eventually be used for data mining and report generation.

What are Hands-on Labs?

Hands-on Labs are real environments created by industry experts to help you learn. These environments help you gain knowledge and experience, practice without compromising your system, test without risk, destroy without fear, and let you learn from your mistakes. Hands-on Labs: practice your skills before delivering in the real world.
