Using Data Pipeline to Export DynamoDB Data to S3

30 minutes
  • 2 Learning Objectives

About this Hands-on Lab

In this lab, we are going to use AWS Data Pipeline to copy DynamoDB data to an S3 bucket as a backup, and look at the different ways DynamoDB and Data Pipeline can be used together to create those backups.

Learning Objectives

Successfully complete this lab by achieving the following learning objectives:

Copy Subnet ID

Before we can create a data pipeline, we’ll need the name of the S3 bucket that we are going to output data to. In a web browser, navigate to S3 in the AWS console. We’ve been provided with a bucket. Click into that bucket and copy its name near the top of the screen (it should start with cfst-).

  1. In the VPC console, select Subnets.
  2. Copy the ID of a subnet whose route table has a route to an internet gateway (a scripted version of both lookups is sketched below).
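
If we’d rather script these lookups than click through the console, a minimal boto3 sketch might look like the one below. It assumes credentials for the lab account in us-east-1; the variable names are our own, and the only assumptions about the account are what the lab provides (a bucket whose name starts with cfst- and a subnet whose route table points at an internet gateway).

    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")
    ec2 = boto3.client("ec2", region_name="us-east-1")

    # The lab-provided bucket is the one whose name starts with "cfst-".
    bucket_name = next(
        b["Name"] for b in s3.list_buckets()["Buckets"]
        if b["Name"].startswith("cfst-")
    )

    # A "public" subnet is one whose route table has a route through an
    # internet gateway (a GatewayId starting with "igw-").
    public_subnet_id = None
    for rt in ec2.describe_route_tables()["RouteTables"]:
        if not any(r.get("GatewayId", "").startswith("igw-") for r in rt["Routes"]):
            continue
        for assoc in rt.get("Associations", []):
            if assoc.get("SubnetId"):
                public_subnet_id = assoc["SubnetId"]
                break
        if public_subnet_id:
            break

    print("Bucket:", bucket_name, "Subnet:", public_subnet_id)
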
Create Data Pipeline
  1. Add a name.
  2. In Build Using a Template, under DynamoDB Tables, choose Export DynamoDB Table to S3.
  3. Under Parameters, enter LinuxAcademy as the source DynamoDB table name.
  4. Select the provided S3 Bucket.
  5. For IAM Roles, choose the long IAM role provided with this lab for both the Pipeline Role and the EC2 instance role.
  6. Select Edit in Architect.
  7. Under Resources, add the optional Subnet Id field and paste in the subnet ID copied earlier.
  8. Update Core Instance Type and Master Instance Type to m4.large.
  9. Under Activities, set Resize Cluster Before Running to false.
  10. Save the pipeline.
  11. Activate the pipeline. (A scripted sketch of these steps follows below.)
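
These console steps can also be driven from code. The skeleton below is a minimal boto3 sketch with illustrative names; it assumes the pipeline definition itself (the objects the Export DynamoDB table to S3 template generates, with the LinuxAcademy table, the cfst- bucket, the subnet ID, and the m4.large instance types filled in) is supplied separately, since reproducing the full template here would be guesswork.

    import boto3

    dp = boto3.client("datapipeline", region_name="us-east-1")

    # Create an empty pipeline shell; uniqueId makes the call idempotent.
    pipeline_id = dp.create_pipeline(
        name="ExportLinuxAcademyTable",        # illustrative name
        uniqueId="export-linuxacademy-to-s3",
    )["pipelineId"]

    # The definition objects (EMR cluster, export activity, data nodes, schedule)
    # come from the console template; paste them in here before activating.
    pipeline_objects = []

    if pipeline_objects:
        dp.put_pipeline_definition(
            pipelineId=pipeline_id,
            pipelineObjects=pipeline_objects,
        )
        # Activation is what actually schedules the export run.
        dp.activate_pipeline(pipelineId=pipeline_id)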

Additional Resources

How to Begin

Please go to the AWS Console using the link provided after lab creation is complete.

Log in using the credentials provided to you. You should have been given a user name of cloud_user and a randomly generated password.

Please make sure you are in the us-east-1 region before beginning.

Other Uses

One more thing we can do is move data the other way, from S3 back into DynamoDB. We're not going to actually do it in this lab, but to get an idea of how it works, go back into the Data Pipeline console after completing the lab. Click List Pipelines at the top so we've got a fresh screen, then click Create new pipeline. In the top section, take a look at the Source. If we select Build using a template and click the Choose... dropdown, we can see Export DynamoDB table to S3 (what we chose last time we were here), and right below it is Import DynamoDB backup data from S3. Outstanding!
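
The import template also runs as an EMR job, so we won't reproduce it here. Purely to illustrate the reverse direction, though, here is a hand-rolled boto3 sketch that reads one exported file from S3 and writes its items back into the table. This is not the Data Pipeline mechanism: the object key is hypothetical, and it assumes each line of the file is a plain DynamoDB-JSON item, which may not match the template's actual export format.

    import json

    import boto3
    from boto3.dynamodb.types import TypeDeserializer

    s3 = boto3.client("s3", region_name="us-east-1")
    table = boto3.resource("dynamodb", region_name="us-east-1").Table("LinuxAcademy")
    deserializer = TypeDeserializer()

    # Hypothetical bucket/key; list the cfst- bucket to find the real export files.
    bucket = "cfst-example-bucket"
    key = "backups/2019-01-01-00-00-00/part-r-00000"

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

    with table.batch_writer() as batch:
        for line in body.splitlines():
            if not line.strip():
                continue
            raw_item = json.loads(line)  # assumed: one DynamoDB-JSON item per line
            item = {k: deserializer.deserialize(v) for k, v in raw_item.items()}
            batch.put_item(Item=item)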

