Scheduling Amazon DynamoDB Backups with Lambda, Python, and Boto3
Jun 08, 2023
Let's assume you want to make a backup of one of your DynamoDB tables each day, and retain those backups for a specified period of time. A simple way to achieve this is to use an Amazon CloudWatch Events rule to trigger an AWS Lambda function daily. In this hands-on AWS lab, you will write a Lambda function in Python using the Boto3 library. Setting this up requires configuring an IAM role, setting up a CloudWatch rule, and creating a Lambda function.
Create the DynamoDB Table
You can certainly use any DynamoDB table you have in your account for this exercise, but if you want to create one using the AWS CLI, you may use the following command:
aws dynamodb create-table \
  --table-name Person \
  --attribute-definitions AttributeName=id,AttributeType=N \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST
This creates a DynamoDB table called Person with a numeric partition key named id, using on-demand (pay-per-request) billing.
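If you'd like some data in the table to back up, you can add a sample item with the AWS CLI. The item below is purely illustrative; the name attribute is an assumption, and any attributes beyond the id key will do:

aws dynamodb put-item \
  --table-name Person \
  --item '{"id": {"N": "1"}, "name": {"S": "Alice"}}'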
Create the IAM Execution Role
All Lambda functions require an IAM role that defines the permissions granted to the function; this is referred to as the Lambda function's execution role. First, we'll walk through the process of authoring the IAM role for the Lambda function, and then we'll create the Lambda function itself. We'll be using the AWS Management Console for this task:
- Navigate to IAM.
- Navigate to Policies.
- Click Create Policy.
- Select the JSON tab.
- Replace the default content with the following JSON statement:
{ "Version":"2012-10-17", "Statement":[ { "Effect":"Allow", "Action":[ "logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents" ], "Resource":"arn:aws:logs:*:*:*" }, { "Action":[ "dynamodb:CreateBackup", "dynamodb:DeleteBackup", "dynamodb:ListBackups" ], "Effect":"Allow", "Resource":"*" } ]}
This statement grants two sets of permissions. First, it grants the ability to log to CloudWatch Logs; with this permission, any Python print() statements will appear in CloudWatch Logs. Second, it grants the Lambda function permission to create, list, and delete DynamoDB backups on all tables.
- Click Review Policy.
- Name this policy LambdaBackupDynamoDBPolicy.
- Click Create Policy.
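If you prefer to script this step instead of clicking through the console, the same policy can be created with the AWS CLI. This sketch assumes you've saved the JSON statement above to a local file named policy.json:

aws iam create-policy \
  --policy-name LambdaBackupDynamoDBPolicy \
  --policy-document file://policy.json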
Now that the policy is created, you must create a role to which this policy is attached.
- Within IAM, navigate to Roles.
- Click Create Role.
- Select the type of trusted entity: AWS service.
- Choose the service that will use this role: Lambda.
- Click Next: Permissions.
- In the search box, find the LambdaBackupDynamoDBPolicy created in the previous step.
- Check the checkbox next to the policy name.
- Click Next: Tags.
- Click Next: Review.
- Role name: LambdaBackupDynamoDBRole.
- Click Create role.
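The role can also be created from the command line. The sketch below assumes a local file named trust-policy.json containing a standard trust policy that lets the Lambda service assume the role, and that you substitute your own account ID into the policy ARN:

# trust-policy.json should contain:
# {
#   "Version": "2012-10-17",
#   "Statement": [
#     { "Effect": "Allow", "Principal": { "Service": "lambda.amazonaws.com" }, "Action": "sts:AssumeRole" }
#   ]
# }
aws iam create-role \
  --role-name LambdaBackupDynamoDBRole \
  --assume-role-policy-document file://trust-policy.json

aws iam attach-role-policy \
  --role-name LambdaBackupDynamoDBRole \
  --policy-arn arn:aws:iam::<your-account-id>:policy/LambdaBackupDynamoDBPolicy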
Create the Lambda Function
Let's create our Lambda function!
- Navigate to Lambda.
- Click Create function.
- Select Author from scratch.
- Function name: BackupDynamoDB.
- Runtime: Python 3.7.
- Under Permissions, select Choose or create an execution role.
- Under Execution Role, select Use an existing role.
- Under Existing Role, select LambdaBackupDynamoDBRole, created in the previous step.
- Click Create function.
Paste the function's source code into the Lambda code editor (a sketch is shown below), then click Save at the top right of the screen.
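The sketch below uses Boto3 to create a backup named after the table plus a timestamp, then deletes any backups older than a retention window. The BACKUP_RETENTION_DAYS value and the helper function names are illustrative, so adjust them to fit your needs.

import datetime

import boto3

# Illustrative retention window; adjust to your own requirements.
BACKUP_RETENTION_DAYS = 7

dynamodb = boto3.client('dynamodb')


def lambda_handler(event, context):
    # The CloudWatch rule passes {"TableName": "<table>"} as the event.
    table_name = event['TableName']
    create_backup(table_name)
    delete_old_backups(table_name)


def create_backup(table_name):
    # Name each backup after the table plus a timestamp.
    backup_name = table_name + '-' + datetime.datetime.now().strftime('%Y%m%d%H%M%S')
    print('Creating backup:', backup_name)
    dynamodb.create_backup(TableName=table_name, BackupName=backup_name)


def delete_old_backups(table_name):
    # Remove any backups created before the retention window.
    upper_bound = datetime.datetime.now() - datetime.timedelta(days=BACKUP_RETENTION_DAYS)
    response = dynamodb.list_backups(TableName=table_name, TimeRangeUpperBound=upper_bound)
    for backup in response.get('BackupSummaries', []):
        print('Deleting backup:', backup['BackupArn'])
        dynamodb.delete_backup(BackupArn=backup['BackupArn'])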
Create a CloudWatch Rule
Next, we'll create a CloudWatch rule to schedule the Lambda function to run at regular intervals. This will perform backups of the DynamoDB table and remove stale backups.
- Navigate to CloudWatch.
- Navigate to Events > Rules.
- Click Create rule.
- Schedule the event to run at the desired interval (e.g., every 1 day).
- Click Add target.
- Under Lambda function, select BackupDynamoDB.
- Under Configure input, select Constant (JSON text).
- Set the value to the JSON statement:
{"TableName": "Person"}
- Click Configure details.
- Name: BackupDynamoDBDaily (or whatever you prefer).
- Click Create rule.
- Wait for the CloudWatch rule to trigger the next backup job you have scheduled. If you're impatient like me, you can set the schedule interval to 1 minute, and you'll see it run sooner.
- Verify the scheduled backup job ran using CloudWatch Logs. The Log Group will be named /aws/lambda/BackupDynamoDB, with a stream for each invocation.
- Verify the backup exists in the list of DynamoDB backups.
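You can also check from the command line. Assuming the table is named Person, the following command lists its on-demand backups:

aws dynamodb list-backups --table-name Person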
Want to Learn More?
I hope you'll find this technique useful in your own work. If you want to learn more useful AWS automation techniques like this, check out my new course, Automating AWS with Lambda, Python, and Boto3.