February 18, 2019 was a very important day for AWS DevOps Professional Certification aspirants. This is the day AWS introduced its new exam, after having put it through a Beta phase in November 2018. The new exam has a few new wrinkles, and I’ve been working feverishly to update the existing AWS DevOps Pro Certification course. I am adding new sections to the course to accommodate the newly updated exam, including:
- Deployment Pipelines
- AWS Lambda
- AWS API Gateway
- AWS Secrets Manager
- Event Sources and Event Triggers
Now for a sneak preview of one of the 25 new lessons already added to the course, let’s talk about AWS Lambda, and specifically about using AWS SQS as an Event Source for Lambda. Lambda is an event-driven service and can be triggered by any number of events from many different Event Sources. Examples of Event Sources and Event Triggers include adding files (the trigger) to an S3 Bucket (the Event Source), inputting orders into a DynamoDB table, and adding messages to an SQS queue. These are just a few examples among many options, but we’ll focus on SQS.
Using AWS SQS as an Event Source for Lambda
Let’s walk through how you can use both the AWS Management Console and the AWS CLI to configure a Lambda function. Start by using the AWS Management Console to create an SQS queue and send test messages to the queue. This way you can verify proper configuration of SQS as your Lambda Event Source. Now before doing anything with Lambda, set up a Lambda Execution Role. The Lambda Execution Role grants your Lambda Function permissions to access AWS Services. In this particular case, your Lambda Function is going to need permission to communicate with AWS SQS. As you might expect, the Lambda Execution Role can be created in IAM. So when you go over to IAM and create a Role, don’t forget to attach a Permissions Policy to your Role. Specifically, a Permissions Policy granting permissions to work with SQS:
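As a sketch, that Permissions Policy might look like the following (the three SQS actions shown are what an SQS Event Source requires; the `Resource` ARN is a placeholder you would scope down to your own queue):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sqs:ReceiveMessage",
        "sqs:DeleteMessage",
        "sqs:GetQueueAttributes"
      ],
      "Resource": "arn:aws:sqs:*:*:*"
    }
  ]
}
```

AWS also ships a managed policy, AWSLambdaSQSQueueExecutionRole, that grants these same SQS permissions plus CloudWatch Logs access, so attaching that managed policy is a convenient alternative to writing your own.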
Launching an EC2 Instance and Creating a Lambda Function
Now the next step is to launch an EC2 instance and assign it a public IP address so that you can SSH into the instance. This will allow you to issue AWS Lambda commands from the Command Line Interface to configure your Lambda Function. Let’s use Node.js, with the Lambda Function code housed in a file named index.js. All you have to do is zip this file up and you are well on your way to creating your Lambda Function. Once you have the zip file containing your Lambda Function, use the `aws lambda create-function` CLI command to create your Lambda Function (check out this lesson for details).
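Sketching those steps with illustrative names (the function name, role name, and account ID are placeholders, and the `create-function` call itself is shown commented out since it needs live AWS credentials and the Execution Role ARN from the previous step):

```shell
# index.js -- a minimal sketch of an SQS-triggered Node.js handler.
cat > index.js <<'EOF'
exports.handler = async (event) => {
  // Lambda delivers SQS messages in batches under event.Records.
  for (const record of event.Records) {
    console.log(`Message ${record.messageId}: ${record.body}`);
  }
  return { batchSize: event.Records.length };
};
EOF

# Zip the handler so it can be uploaded as the function's deployment package.
zip -q function.zip index.js

# Then create the function from the zip (commented out -- needs credentials):
# aws lambda create-function \
#   --function-name sqs-demo-function \
#   --runtime nodejs8.10 \
#   --role arn:aws:iam::123456789012:role/lambda-sqs-execution-role \
#   --handler index.handler \
#   --zip-file fileb://function.zip
```

Note that `--handler index.handler` means "the exported `handler` function in index.js" — the file name and export name must match what is in your zip.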
Creating an SQS Queue
After the Lambda Function is created from the CLI, use the AWS Management Console to create your SQS queue:
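The Management Console clicks map onto CLI equivalents like these, shown commented out since they need live AWS credentials (the queue name and URL are illustrative placeholders):

```shell
# Names used throughout this walkthrough (illustrative placeholders):
QUEUE_NAME="lambda-demo-queue"
QUEUE_URL="https://sqs.us-east-1.amazonaws.com/123456789012/${QUEUE_NAME}"

# The console's "Create New Queue" action corresponds to:
# aws sqs create-queue --queue-name "${QUEUE_NAME}"

# The queue's ARN (needed shortly for the Event Source Mapping) can be read back with:
# aws sqs get-queue-attributes --queue-url "${QUEUE_URL}" --attribute-names QueueArn

echo "Queue to create: ${QUEUE_NAME}"
```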
Creating an Event Source Mapping
So you now have a Lambda Function and an Event Source (the SQS queue). But how does your Event Source know which Lambda Function to trigger? After all, you could have hundreds of Lambda Functions in your account. You need to create an Event Source Mapping. This maps your SQS queue to your Lambda Function, so that when the trigger action occurs (a new message sent to your queue), the correct Lambda Function is invoked. You do this with the CLI command `aws lambda create-event-source-mapping` (for more details, check out the lesson here). So to summarize, you have:
- Created a Lambda Execution Role in IAM
- Launched an EC2 instance and SSH’d in to issue CLI commands
- Created a Lambda Function from the CLI
- Created an SQS Queue from the Management Console
- Created an Event Source Mapping from the CLI
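The Event Source Mapping step above can be sketched as follows (the function name, region, account ID, and queue name are placeholders, and the `aws` calls are commented out since they need live credentials):

```shell
FUNCTION_NAME="sqs-demo-function"
QUEUE_ARN="arn:aws:sqs:us-east-1:123456789012:lambda-demo-queue"

# Connect the queue to the function. --batch-size controls how many messages
# Lambda pulls per invocation (1-10 for SQS):
# aws lambda create-event-source-mapping \
#   --function-name "${FUNCTION_NAME}" \
#   --event-source-arn "${QUEUE_ARN}" \
#   --batch-size 10

# Verify the mapping exists:
# aws lambda list-event-source-mappings --function-name "${FUNCTION_NAME}"

echo "Would map ${QUEUE_ARN} -> ${FUNCTION_NAME}"
```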
Taken as a whole, this may seem a little complicated. But broken down step by step, each step is relatively simple. Everything is in place and your Lambda Function is sitting there waiting to be triggered. And remember, with Lambda, you are only charged while your Lambda Function is executing! So how do you trigger your Lambda Function to test it? You go to SQS in the AWS Management Console and start sending messages to your SQS queue. This will trigger your Lambda Function into action! It’s just that simple. If you’re looking to go a bit more in-depth, check out this lesson here.
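The console’s “Send a Message” action corresponds to a single CLI call, sketched here with placeholder names (commented out, since it needs live credentials):

```shell
QUEUE_URL="https://sqs.us-east-1.amazonaws.com/123456789012/lambda-demo-queue"

# Each message sent becomes a record in the SQS event delivered to your function:
# aws sqs send-message --queue-url "${QUEUE_URL}" --message-body '{"orderId": 42}'

echo "Would send a test message to ${QUEUE_URL}"
```

After sending a message, you can confirm the trigger fired by checking the function’s CloudWatch Logs for the log line your handler writes.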
AWS Certified DevOps Engineer – Professional Level
My goal has been to get as many of the lessons for these new sections into the course as possible by February 18. I’m happy to say that I’ve completed all of the lessons for Deployment Pipelines, and they are now in the existing course. I’ve also completed 5 of the 7 lessons for Lambda, and they are in the course as well. The API Gateway work is nearly complete, and those lessons will be added in the next few days; I hope to have all of the Secrets Manager lessons added by the beginning of next week. I will then circle back and start adding other features to these new sections, such as Quizzes and Labs. So keep your eyes open for additional announcements over the next few weeks on new content in the AWS DevOps Pro Certification course: 25 new lessons (and counting) for the newly released AWS DevOps Pro Certification exam are now available!