Working with AWS VPC Flow Logs for Network Monitoring

**Thank you for your interest in this content. Unfortunately, this content is no longer being updated and some of it may be out-of-date.**

Monitoring network traffic is a critical component of security best practice: it helps you meet compliance requirements, investigate security incidents, track key metrics, and configure automated notifications. AWS VPC Flow Logs capture information about the IP traffic going to and from network interfaces in your VPC. In this hands-on lab, we will set up and use VPC Flow Logs published to Amazon CloudWatch, create custom metrics and alerts based on the CloudWatch logs to understand trends and receive notifications of potential security issues, and use Amazon Athena to query and analyze VPC Flow Logs stored in S3.


Path Info

Level: Advanced
Duration: 2h 0m
Published: Nov 17, 2021


Table of Contents

  1. Challenge

    Create an S3 Bucket for VPC Flow Logs and VPC Flow Log to S3

    Create an S3 Bucket for VPC Flow Logs

    Create an S3 bucket, and click Copy Bucket ARN in the popup window. Paste the ARN into your favorite text editor; we will use it in an upcoming task.

    Create VPC Flow Log to S3

    Make sure the destination is the S3 bucket you just created.
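
    If you prefer to script this challenge, a minimal boto3 sketch is shown below; the bucket name and VPC ID are placeholders, not values from the lab environment.

      import boto3

      ec2 = boto3.client("ec2", region_name="us-east-1")
      s3 = boto3.client("s3", region_name="us-east-1")

      # Create the destination bucket (us-east-1 needs no LocationConstraint).
      s3.create_bucket(Bucket="my-vpc-flow-logs-bucket")  # placeholder name

      # Publish all traffic (accepted and rejected) for the VPC to the bucket.
      ec2.create_flow_logs(
          ResourceIds=["vpc-0123456789abcdef0"],  # placeholder VPC ID
          ResourceType="VPC",
          TrafficType="ALL",
          LogDestinationType="s3",
          LogDestination="arn:aws:s3:::my-vpc-flow-logs-bucket",
      )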

  2. Challenge

    Create CloudWatch Log Group and VPC Flow Log to CloudWatch

    Create CloudWatch Log Group

    Enter VPCFlowLogs for the name.

    Create VPC Flow Log to CloudWatch

    Make sure to select Send to CloudWatch Logs for the Destination.
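
    These console steps can also be scripted. A minimal boto3 sketch, assuming a placeholder VPC ID and a pre-existing IAM role that lets the flow logs service write to CloudWatch Logs (the role ARN below is a placeholder):

      import boto3

      logs = boto3.client("logs")
      ec2 = boto3.client("ec2")

      logs.create_log_group(logGroupName="VPCFlowLogs")

      # Delivery to CloudWatch Logs requires an IAM role the flow logs
      # service can assume; the ARN here is a placeholder.
      ec2.create_flow_logs(
          ResourceIds=["vpc-0123456789abcdef0"],  # placeholder VPC ID
          ResourceType="VPC",
          TrafficType="ALL",
          LogDestinationType="cloud-watch-logs",
          LogGroupName="VPCFlowLogs",
          DeliverLogsPermissionArn="arn:aws:iam::123456789012:role/VPCFlowLogsRole",
      )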

  3. Challenge

    Generate Traffic

    1. Log in to the EC2 instance via SSH using the credentials provided.
    2. Exit the terminal.
    3. Change the security group on the instance to only allow HTTP access (a scripted approach is sketched after this list).
    4. Attempt to log in again.
    5. Change the security group on the instance to allow SSH and HTTP access.
    6. Log in again.
    7. Exit the terminal.
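
    A sketch of the security group changes in steps 3 and 5, assuming a placeholder security group ID and a world-open SSH rule; removing the rule is what produces the REJECT records the later challenges depend on:

      import boto3

      ec2 = boto3.client("ec2")
      SG_ID = "sg-0123456789abcdef0"  # placeholder security group ID

      ssh_rule = [{"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
                   "IpRanges": [{"CidrIp": "0.0.0.0/0"}]}]

      # Step 3: drop SSH so login attempts are rejected (HTTP rule stays).
      ec2.revoke_security_group_ingress(GroupId=SG_ID, IpPermissions=ssh_rule)

      # Step 5: restore SSH access.
      ec2.authorize_security_group_ingress(GroupId=SG_ID, IpPermissions=ssh_rule)
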
  4. Challenge

    Create CloudWatch Filters and Alerts

    Create CloudWatch Log Metric Filter

    Add a metric filter to a CloudWatch log group.

    1. Enter the following Filter Pattern to track failed SSH attempts on port 22:

      [version, account, eni, source, destination, srcport, destport="22", protocol="6", packets, bytes, windowstart, windowend, action="REJECT", flowlogstatus]
      
    2. In the Select Log Data to Test dropdown, select Custom Log Data.

    3. Enter the following in the text box:

      2 086112738802 eni-0d5d75b41f9befe9e 61.177.172.128 172.31.83.158 39611 22 6 1 40 1563108188 1563108227 REJECT OK
      2 086112738802 eni-0d5d75b41f9befe9e 182.68.238.8 172.31.83.158 42227 22 6 1 44 1563109030 1563109067 REJECT OK
      2 086112738802 eni-0d5d75b41f9befe9e 42.171.23.181 172.31.83.158 52417 22 6 24 4065 1563191069 1563191121 ACCEPT OK
      2 086112738802 eni-0d5d75b41f9befe9e 61.177.172.128 172.31.83.158 39611 80 6 1 40 1563108188 1563108227 REJECT OK
      
    4. Click Test Pattern.

    5. Click on the Show test results link. You should see 2 of the 4 records matching.

    6. Click Assign Metric.

    7. Enter destination-port-22-rejects as the Filter Name.

    8. Enter SSH failures as the Metric Name.
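
    The same filter and metric can be created programmatically. A sketch using put_metric_filter; the metric namespace is an assumption, since the console assigns one when the filter is created:

      import boto3

      logs = boto3.client("logs")

      logs.put_metric_filter(
          logGroupName="VPCFlowLogs",
          filterName="destination-port-22-rejects",
          filterPattern='[version, account, eni, source, destination, srcport, '
                        'destport="22", protocol="6", packets, bytes, '
                        'windowstart, windowend, action="REJECT", flowlogstatus]',
          metricTransformations=[{
              "metricName": "SSH failures",
              "metricNamespace": "LogMetrics",  # assumed namespace
              "metricValue": "1",  # emit 1 per matching log event
          }],
      )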

    Create Alarm Based on the Metric Filter

    1. Click on the Create Alarm link in the new destination-port-22-rejects metric filter box.

    2. Select the Greater/Equal radio button.

    3. Enter 1 in the Define the threshold value edit box.

    4. Select the Create new topic button.

    5. Enter PROD-ALERT-{your-initials} in the Create a new topic... edit box.

    6. Enter your email address in the Email endpoints that will receive the notification edit box.

    7. Enter SSH REJECT as the Alarm name.

    8. You will receive an email notification asking you to confirm your subscription. Click the Confirm Subscription link in the email.
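
    Steps 1-7 map onto a few API calls. A sketch assuming a Sum statistic over a 5-minute period (the console's usual defaults for this kind of alarm); the topic suffix and email address are placeholders:

      import boto3

      sns = boto3.client("sns")
      cloudwatch = boto3.client("cloudwatch")

      # Create the notification topic and subscribe an email endpoint.
      topic_arn = sns.create_topic(Name="PROD-ALERT-ab")["TopicArn"]  # placeholder initials
      sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="you@example.com")

      # Alarm whenever the SSH failures metric is >= 1 in a period.
      cloudwatch.put_metric_alarm(
          AlarmName="SSH REJECT",
          Namespace="LogMetrics",  # must match the metric filter's namespace
          MetricName="SSH failures",
          Statistic="Sum",
          Period=300,  # assumed 5-minute period
          EvaluationPeriods=1,
          Threshold=1,
          ComparisonOperator="GreaterThanOrEqualToThreshold",
          AlarmActions=[topic_arn],
      )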

  5. Challenge

    Generate Traffic for Alerts

    1. Log in to the EC2 instance via SSH using the credentials provided.
    2. Exit the terminal.
    3. Change the security group on the instance to only allow HTTP access.
    4. Attempt to log in again.
    5. Change the security group on the instance to allow SSH and HTTP access.
    6. Log in again.
    7. Exit the terminal.
  6. Challenge

    Use CloudWatch Insights

    1. Navigate to CloudWatch.
    2. Click Insights in the left-hand menu.
    3. Select VPCFlowLogs in the Select a log group search window.
    4. Click Sample queries > VPC flow log queries > Top 20 source IP addresses with highest number of rejected requests.
    5. Observe the query has changed.
    6. Click Run query.
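
    The same Insights query can be run from code with start_query. In the sketch below, the query string approximates the AWS sample loaded in step 4 and may differ from it in detail:

      import time
      import boto3

      logs = boto3.client("logs")

      # Approximation of the "Top 20 source IP addresses with highest
      # number of rejected requests" sample query.
      query = (
          'filter action = "REJECT" '
          "| stats count(*) as numRejections by srcAddr "
          "| sort numRejections desc "
          "| limit 20"
      )

      end = int(time.time())
      q = logs.start_query(
          logGroupName="VPCFlowLogs",
          startTime=end - 3600,  # last hour
          endTime=end,
          queryString=query,
      )

      # Poll until the query finishes, then print the result rows.
      while True:
          resp = logs.get_query_results(queryId=q["queryId"])
          if resp["status"] not in ("Scheduled", "Running"):
              break
          time.sleep(1)
      print(resp["results"])
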
  7. Challenge

    Configure VPC Flow Logs Table and Partition in Athena

    Record Reference Information to Be Used in Athena Queries Using the S3 Bucket You Created

    Create the Athena Table

    1. Paste the following DDL code in the new query window:

      CREATE EXTERNAL TABLE IF NOT EXISTS default.vpc_flow_logs (
        version int,
        account string,
        interfaceid string,
        sourceaddress string,
        destinationaddress string,
        sourceport int,
        destinationport int,
        protocol int,
        numpackets int,
        numbytes bigint,
        starttime int,
        endtime int,
        action string,
        logstatus string
      )  
      PARTITIONED BY (dt string)
      ROW FORMAT DELIMITED
      FIELDS TERMINATED BY ' '
      LOCATION 's3://{your_log_bucket}/AWSLogs/{account_id}/vpcflowlogs/us-east-1/'
      TBLPROPERTIES ("skip.header.line.count"="1");
      
    2. Update {your_log_bucket} and {account_id} in the query window with the values from this hands-on lab.

    3. Click Run query.

    4. You should see a Query successful. message once this has finished executing.

    Create Partitions to Be Able to Read the Data

    1. Paste the following code in a new query window:

      ALTER TABLE default.vpc_flow_logs
      ADD PARTITION (dt='{Year}-{Month}-{Day}')
      location 's3://{your_log_bucket}/AWSLogs/{account_id}/vpcflowlogs/us-east-1/{Year}/{Month}/{Day}';
      
    2. Update the following elements in the query window with the values from this hands-on lab:

      • {your_log_bucket}
      • {account_id}
      • {Year}, {Month}, and {Day}
    3. Click Run query.

    4. You should receive a Query successful. message.
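
    The partition statement (and the earlier DDL) can also be submitted through the Athena API. A sketch using start_query_execution, with an example date; the results bucket is a placeholder, since Athena needs an output location for query results:

      import boto3

      athena = boto3.client("athena")

      # Example date shown; fill in the bucket and account ID from the lab.
      partition_ddl = """
      ALTER TABLE default.vpc_flow_logs
      ADD PARTITION (dt='2021-11-17')
      location 's3://{your_log_bucket}/AWSLogs/{account_id}/vpcflowlogs/us-east-1/2021/11/17'
      """

      athena.start_query_execution(
          QueryString=partition_ddl,
          QueryExecutionContext={"Database": "default"},
          ResultConfiguration={"OutputLocation": "s3://my-athena-results-bucket/"},  # placeholder
      )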

  8. Challenge

    Analyze VPC Flow Logs Data in Athena

    Run the following query in a new query window:

    ```
    SELECT day_of_week(from_iso8601_timestamp(dt)) AS day,
      dt,
      interfaceid,
      sourceaddress,
      destinationport,
      action,
      protocol
    FROM vpc_flow_logs
    WHERE action = 'REJECT' AND protocol = 6
    ORDER BY sourceaddress
    LIMIT 100;
    ```
    
