Azure Data Factory is a core service for any Azure cloud project. It is an orchestration service responsible for moving and automating data into and throughout the Azure cloud. In this lab, we will learn how to connect data sources and create a data pipeline that moves data in Azure. In this scenario, you work for a company that sells various items online across the United States. You have been asked to research a way to combine sales data with other data sources, such as parking lot information, to highlight potential locations for new brick-and-mortar stores. To do this, you will put together a presentation on leveraging cloud services (specifically Data Factory) to automate this research.
Related Microsoft Learn modules:

- https://docs.microsoft.com/en-us/learn/modules/data-integration-azure-data-factory/
- https://docs.microsoft.com/en-us/learn/modules/orchestrate-data-movement-transformation-azure-data-factory/
- https://docs.microsoft.com/en-us/learn/modules/receive-data-with-azure-data-share-transforming-with-azure-data-factory/
- https://docs.microsoft.com/en-us/learn/modules/create-production-workloads-azure-databricks-azure-data-factory/
Learning Objectives
Successfully complete this lab by achieving the following learning objectives:
- Prepare the Environment
  Provision the following (a scripted sketch of this provisioning appears after the objectives list):
  - A Data Factory instance
    - West US 2 region, V2, configure Git later
  - A SQL database
    - Create a server in East US
    - Ensure it’s set to 5 (Basic) DTUs at 2 GB
  - A storage account
    - West US 2, Standard Performance, Storage V2, RA-GRS
    - Create 2 containers (`raw`, `curated`)
- Create and Connect Datasets
  Connect the SQL database (`SalesLT.Address` table) and storage account from the previous step. Then, download a CSV file and add it to the blob storage account for later use (a sketch of these connections also appears after the list).
- Create the Copy Steps of Our Pipeline
  Create a copy step that pulls the `SalesLT.Address` data from the SQL database and outputs it to the storage account blob container. Then, validate and execute this pipeline (see the pipeline sketch after the list).
- Use Data Flow to Combine Data from our Copied File and CSV File
  Use Data Flow to add data from the newly copied file to the CSV file. The combined data is then output to a new blob file in the `curated` container (an illustrative version of this join appears after the list).
- Publish the Pipeline
  Publish the completed pipeline.
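
The lab itself is completed in the Azure portal, but each step can also be pictured in code. The sketches below use the Azure SDK for Python and are illustrative only: the resource group `adf-lab-rg`, the resource names (`adf-lab-datafactory`, `adf-lab-sql`, `adflabstorage`), and all connection strings and file names are assumptions rather than values from the lab, and exact model classes can differ between SDK versions. This first sketch covers the provisioning in the Prepare the Environment objective.

```python
# Hypothetical provisioning sketch; resource group and resource names are assumptions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.storage import StorageManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
RG = "adf-lab-rg"  # assumed, pre-existing resource group
cred = DefaultAzureCredential()

# Data Factory (V2) in West US 2; Git integration is configured later, as in the lab.
adf = DataFactoryManagementClient(cred, SUBSCRIPTION_ID)
adf.factories.create_or_update(RG, "adf-lab-datafactory", Factory(location="westus2"))

# SQL server in East US with a Basic database (5 DTUs, 2 GB max size).
# The AdventureWorksLT sample is loaded so the SalesLT.Address table exists.
sql = SqlManagementClient(cred, SUBSCRIPTION_ID)
sql.servers.begin_create_or_update(
    RG, "adf-lab-sql",
    {"location": "eastus",
     "administrator_login": "labadmin",
     "administrator_login_password": "<strong-password>"},
).result()
sql.databases.begin_create_or_update(
    RG, "adf-lab-sql", "adf-lab-db",
    {"location": "eastus",
     "sku": {"name": "Basic", "tier": "Basic", "capacity": 5},
     "max_size_bytes": 2 * 1024 ** 3,
     "sample_name": "AdventureWorksLT"},
).result()

# StorageV2 account in West US 2 with Standard performance and RA-GRS redundancy,
# plus the raw and curated blob containers.
storage = StorageManagementClient(cred, SUBSCRIPTION_ID)
storage.storage_accounts.begin_create(
    RG, "adflabstorage",
    {"location": "westus2", "kind": "StorageV2", "sku": {"name": "Standard_RAGRS"}},
).result()
for container in ("raw", "curated"):
    storage.blob_containers.create(RG, "adflabstorage", container, {})
```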
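
A second sketch, under the same assumptions, for the Create and Connect Datasets objective: linked services for the SQL database and storage account, datasets for the `SalesLT.Address` table and the output file in the `raw` container, and the CSV upload into blob storage. The linked service and dataset names, the file names, and the choice of a delimited-text output are placeholders.

```python
# Hypothetical sketch of linked services, datasets, and the CSV upload.
# Names, connection strings, and file names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService, AzureBlobStorageLocation,
    AzureSqlDatabaseLinkedService, AzureSqlTableDataset, DatasetResource,
    DelimitedTextDataset, LinkedServiceReference, LinkedServiceResource, SecureString,
)
from azure.storage.blob import BlobClient

SUBSCRIPTION_ID, RG, DF = "<your-subscription-id>", "adf-lab-rg", "adf-lab-datafactory"
adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Linked service for the Azure SQL database that holds SalesLT.Address.
adf.linked_services.create_or_update(RG, DF, "SqlLinkedService", LinkedServiceResource(
    properties=AzureSqlDatabaseLinkedService(
        connection_string=SecureString(value="<azure-sql-connection-string>"))))

# Linked service for the storage account that holds the raw and curated containers.
adf.linked_services.create_or_update(RG, DF, "BlobLinkedService", LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string="<storage-account-connection-string>")))

# Dataset over the SalesLT.Address table (the copy source).
adf.datasets.create_or_update(RG, DF, "AddressTable", DatasetResource(
    properties=AzureSqlTableDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="SqlLinkedService"),
        table_name="SalesLT.Address")))

# Delimited-text dataset over the copy output in the raw container (the copy sink).
adf.datasets.create_or_update(RG, DF, "RawAddressCsv", DatasetResource(
    properties=DelimitedTextDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="BlobLinkedService"),
        location=AzureBlobStorageLocation(container="raw", file_name="address.csv"),
        first_row_as_header=True)))

# Stage the downloaded CSV (the extra data source in the scenario) in the raw container.
with open("parking-lots.csv", "rb") as data:
    BlobClient.from_connection_string(
        "<storage-account-connection-string>", container_name="raw",
        blob_name="parking-lots.csv").upload_blob(data, overwrite=True)
```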
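
The Create the Copy Steps of Our Pipeline objective then reduces to a single Copy activity between those two datasets, plus an on-demand run to validate the result. This is a minimal sketch, again with illustrative names carried over from the previous snippets.

```python
# Hypothetical copy pipeline: one Copy activity from the AddressTable dataset to
# the RawAddressCsv dataset, followed by an on-demand run and a status check.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlSource, CopyActivity, DatasetReference, DelimitedTextSink, PipelineResource,
)

SUBSCRIPTION_ID, RG, DF = "<your-subscription-id>", "adf-lab-rg", "adf-lab-datafactory"
adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

copy_step = CopyActivity(
    name="CopyAddressToRaw",
    inputs=[DatasetReference(type="DatasetReference", reference_name="AddressTable")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="RawAddressCsv")],
    source=AzureSqlSource(),
    sink=DelimitedTextSink(),
)
adf.pipelines.create_or_update(RG, DF, "CopyAddressPipeline",
                               PipelineResource(activities=[copy_step]))

# Trigger an on-demand run and check its status, which corresponds to validating
# and executing the pipeline in the Data Factory authoring UI.
run = adf.pipelines.create_run(RG, DF, "CopyAddressPipeline")
print(adf.pipeline_runs.get(RG, DF, run.run_id).status)
```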
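
The Data Flow objective is authored visually in the Data Factory UI, so no SDK equivalent is shown here. Purely as an illustration of what that flow does, the snippet below joins the copied address file with the parking lot CSV locally and writes the result to the `curated` container; this is not an ADF Mapping Data Flow, and the file names and the `City` join column are assumptions about data the lab does not specify.

```python
# Illustrative only: a local picture of the combine step, NOT an ADF Mapping Data Flow.
# File names and the "City" join column are assumptions.
import pandas as pd
from azure.storage.blob import BlobClient

addresses = pd.read_csv("address.csv")      # file copied from SalesLT.Address
parking = pd.read_csv("parking-lots.csv")   # CSV added to the raw container earlier

# Join address rows to parking lot rows on a shared column, then place the result
# where the Data Flow sink would: a new blob in the curated container.
combined = addresses.merge(parking, on="City", how="inner")
combined.to_csv("combined.csv", index=False)

with open("combined.csv", "rb") as data:
    BlobClient.from_connection_string(
        "<storage-account-connection-string>", container_name="curated",
        blob_name="combined.csv").upload_blob(data, overwrite=True)
```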