Automating AWS with Lambda, Python, and Boto3

Scheduling Amazon DynamoDB Backups with Lambda, Python, and Boto3


Let’s assume you want to make a backup of one of your DynamoDB tables each day, and to retain those backups for a specified period of time. A simple way to achieve this is to use an Amazon CloudWatch Events rule to trigger an AWS Lambda function daily. In this hands-on AWS lab, you will write a Lambda function in Python using the Boto3 library. Setting this up requires configuring an IAM role, setting a CloudWatch rule, and creating a Lambda function.

Create the DynamoDB Table

You can certainly use any DynamoDB table you have in your account for this exercise, but if you want to create one using the AWS CLI, you may use the following command:

aws dynamodb create-table \
  --table-name Person \
  --attribute-definitions AttributeName=id,AttributeType=N \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST

This will create a DynamoDB table called Person with a numeric partition key named id.

Create the IAM Execution Role

Every Lambda function requires an IAM role that defines the permissions granted to it; this is referred to as the function’s execution role. First, we’ll walk through authoring the IAM role for the Lambda function, then create the Lambda function itself. We’ll be using the AWS Management Console for this task:

  1. Navigate to IAM.
  2. Navigate to Policies.
  3. Click Create Policy.
  4. Select the JSON tab.
  5. Replace the default content with the following JSON statement:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:CreateBackup",
        "dynamodb:DeleteBackup",
        "dynamodb:ListBackups"
      ],
      "Resource": "*"
    }
  ]
}

This statement grants two sets of permissions. First, it allows the function to write to CloudWatch Logs; with this permission, any Python print() statements will appear in CloudWatch Logs. Second, it allows the Lambda function to create, list, and delete DynamoDB backups on all tables.

  1. Click Review Policy.
  2. Name this policy LambdaBackupDynamoDBPolicy.
  3. Click Create Policy.

Now that the policy is created, you must create a role to which this policy is attached.

  1. Within IAM, navigate to Roles.
  2. Click Create Role.
  3. Select the type of trusted entity: AWS service.
  4. Choose the service that will use this role: Lambda.
  5. Click Next: Permissions.
  6. In the search box, find the LambdaBackupDynamoDBPolicy created in the previous step.
  7. Check the checkbox next to the policy name.
  8. Click Next: Tags.
  9. Click Next: Review.
  10. Role name: LambdaBackupDynamoDBRole.
  11. Click Create role.

Create the Lambda Function

Let’s create our Lambda function!

  1. Navigate to Lambda.
  2. Click Create function.
  3. Select Author from scratch.
  4. Function name: BackupDynamoDB.
  5. Runtime: Python 3.7.
  6. Under Permissions, select Choose or create an execution role.
  7. Under Execution Role, select Use an existing role.
  8. Under Existing Role, select LambdaBackupDynamoDBRole, created in the previous step.
  9. Click Create function.

Paste your function’s source code into the Lambda code editor, then click Save at the top right of the screen.

Create a CloudWatch Rule

Next, we’ll create a CloudWatch rule to schedule the Lambda function to run at regular intervals. This will perform backups of the DynamoDB table and remove stale backups.

  1. Navigate to CloudWatch.
  2. Navigate to Events > Rules.
  3. Click Create rule.
  4. Under Event Source, select Schedule and set the rule to run at the desired interval (e.g., every 1 day).
  5. Click Add target.
  6. Under Lambda function, select BackupDynamoDB.
  7. Under Configure input, select Constant (JSON text).
  8. Set the value to the JSON statement:
    {"TableName": "Person"}
  9. Click Configure details.
  10. Name: BackupDynamoDBDaily (or whatever you prefer).
  11. Click Create rule.
  12. Wait for the CloudWatch rule to trigger the next backup job you have scheduled. If you’re impatient like me, you can set the schedule interval to 1 minute, and you’ll see it run sooner.
  13. Verify the scheduled backup job ran using CloudWatch Logs. The Log Group will be named /aws/lambda/BackupDynamoDB, with a stream for each invocation.
  14. Verify the backup file exists in the list of DynamoDB backups.

Want to Learn More?

I hope you’ll find this technique useful in your own work. If you want to learn more useful AWS automation techniques like this, check out my new course Automating AWS with Lambda, Python, and Boto3.

