Working with VPC Flow Logs for Network Monitoring in AWS

1 hour
  • 4 Learning Objectives

About this Hands-on Lab

This hands-on lab gives you the opportunity to work with VPC Flow Logs and explores several ways to review them and monitor a network. A common approach is to send VPC Flow Logs to CloudWatch, where you can create metrics and then set alarms based on those metrics. CloudWatch Logs can also be exported to S3, and VPC Flow Logs can even be delivered directly to S3, which reduces cost and is simpler to set up. In this hands-on lab, we will deliver VPC Flow Logs directly to S3 and then use Amazon Athena to query them.

Learning Objectives

Successfully complete this lab by achieving the following learning objectives:

Create an S3 Bucket
  1. Navigate to S3.
  2. Click Create Bucket.
  3. Give the bucket a globally unique name (e.g., "vpcflow4learningactivity" followed by a series of numbers, such as the account ID of the AWS account provisioned with the lab).
  4. Click Next three times.
  5. Click Create Bucket.
  6. Click to open your newly created bucket.
  7. Click Create folder.
  8. In the box next to the folder, enter "AWSLogs".
  9. Click Save.
  10. Click Create folder.
  11. In the box next to the folder, enter "QueryResults".
  12. Click Save.
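
If you prefer to script this step, here is a minimal sketch using boto3 (the bucket name is a placeholder and must be globally unique; S3 has no real folders, so the console's "Create folder" simply writes an empty object whose key ends in a slash):

import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Placeholder name -- replace with your own globally unique bucket name.
bucket = "vpcflow4learningactivity-123456789012"

# In us-east-1, no LocationConstraint is required.
s3.create_bucket(Bucket=bucket)

# "Folders" in S3 are just zero-byte objects whose keys end in "/".
s3.put_object(Bucket=bucket, Key="AWSLogs/")
s3.put_object(Bucket=bucket, Key="QueryResults/")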
Configure VPC Flow Logs
  1. Navigate to VPC > Your VPCs.
  2. Select the LinuxAcademy VPC.
  3. Click Actions > Create flow log.
  4. Set the following values:
    • Filter: All
    • Destination: Send to an S3 bucket
    • S3 bucket ARN: arn:aws:s3:::<YOUR_BUCKET_NAME>
  5. Click Create.
  6. Click the Flow Logs tab to verify the flow log exists.
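
The same flow log can also be created through the API; the following boto3 sketch assumes a placeholder VPC ID and the bucket ARN from the previous objective:

import boto3

ec2 = boto2 = boto3.client("ec2", region_name="us-east-1")

# Placeholders -- substitute the LinuxAcademy VPC ID and your bucket ARN.
vpc_id = "vpc-0123456789abcdef0"
bucket_arn = "arn:aws:s3:::vpcflow4learningactivity-123456789012"

# Capture all traffic (accepted and rejected) and deliver it to S3.
response = ec2.create_flow_logs(
    ResourceIds=[vpc_id],
    ResourceType="VPC",
    TrafficType="ALL",
    LogDestinationType="s3",
    LogDestination=bucket_arn,
)
print(response["FlowLogIds"])

# Equivalent to checking the Flow Logs tab in the console.
flow_logs = ec2.describe_flow_logs(
    Filters=[{"Name": "resource-id", "Values": [vpc_id]}]
)
print(flow_logs["FlowLogs"])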
Create and Query a Sample Table in Amazon Athena
  1. In Athena, specify the QueryResults folder in the S3 bucket as the query results location.
  2. Use the Athena tutorial to create a sample table.
  3. Run a SELECT * query on the table.
  4. Edit the query by replacing * with request_ip and run it again.
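
These steps can also be driven through the Athena API. The sketch below is a rough equivalent, assuming the classic Athena tutorial's sample table (sampledb.elb_logs, which includes a request_ip column) and the QueryResults folder created earlier; adjust the names if your tutorial table differs:

import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Placeholder -- the QueryResults folder created in the first objective.
output = "s3://vpcflow4learningactivity-123456789012/QueryResults/"

# Assumed sample table from the Athena tutorial.
query = "SELECT request_ip FROM sampledb.elb_logs LIMIT 10"

execution = athena.start_query_execution(
    QueryString=query,
    ResultConfiguration={"OutputLocation": output},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

# Print the first page of results.
if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])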
Configure and Query the VPC Flow Logs
  1. In S3, verify that logs have populated the AWSLogs folder in the vpcflow4learningactivity bucket.
  2. In Athena, run the scripts provided on the lab page.

Additional Resources

Log in to the live AWS environment using the credentials provided.

Make sure you're in the N. Virginia (us-east-1) region throughout the lab.

Use this script to create the Athena table for VPC Flow Logs:

CREATE EXTERNAL TABLE IF NOT EXISTS vpc_flow_logs (
  version int,
  account string,
  interfaceid string,
  sourceaddress string,
  destinationaddress string,
  sourceport int,
  destinationport int,
  protocol int,
  numpackets int,
  numbytes bigint,
  starttime int,
  endtime int,
  action string,
  logstatus string
)  
PARTITIONED BY (dt string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ' '
LOCATION 's3://your_log_bucket/prefix/AWSLogs/{subscribe_account_id}/vpcflowlogs/{region_code}/'
TBLPROPERTIES ("skip.header.line.count"="1");

Use this script to account for the table partitions:

ALTER TABLE vpc_flow_logs
ADD PARTITION (dt='YYYY-MM-dd')
location 's3://your_log_bucket/prefix/AWSLogs/{account_id}/vpcflowlogs/{region_code}/YYYY/MM/dd';
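
With the table and partition in place, you can run analytical queries against the flow logs. As one illustrative example (not one of the lab's provided scripts), the query below finds the ten source addresses with the most rejected bytes for the partition you added; the bucket, database, and date are placeholders:

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Placeholder -- the same QueryResults prefix used earlier.
output = "s3://vpcflow4learningactivity-123456789012/QueryResults/"

# Illustrative query: top ten sources of rejected traffic for one day.
query = """
SELECT sourceaddress, SUM(numbytes) AS total_bytes
FROM vpc_flow_logs
WHERE dt = 'YYYY-MM-dd'
  AND action = 'REJECT'
GROUP BY sourceaddress
ORDER BY total_bytes DESC
LIMIT 10
"""

athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "default"},  # assumes the table was created in the default database
    ResultConfiguration={"OutputLocation": output},
)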
