

Storing Container Data in AWS S3

Using Docker volumes is the preferred method of storing container data locally. Volume support is built directly into Docker, which makes volumes easy to use and portable. However, data stored in Docker volumes must still be backed up on your own. There is another option: storing your container data in the cloud. It isn't a solution for every problem, but after this lab you'll have another tool at your disposal.

This lab shows you how to mount an S3 bucket onto your local system as a directory, and then mount that directory into a Docker container. We will use an httpd container to serve the contents of the bucket as a webpage, but you can use the same approach to share any common data between containers. This demonstrates how flexible Docker can be: when you make a change in the bucket, every container using it has near-instant access to the new content.


Path Info

Level
Intermediate
Duration
30m
Published
Nov 20, 2020


Table of Contents

  1. Challenge

    Configuration and Installation

    1. Install the AWS CLI.
    2. Configure the AWS CLI for your user.
    • Use the Access Key ID and Secret Access Key provided in your lab credentials.
    • Use us-east-1 as the default region.
    • Output is optional, but json is a good choice.
    3. Copy the credentials to the root user.
    4. Install s3fs-fuse. This package is in EPEL, which is already installed on the server.
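    The installation and configuration steps above might look like the following sketch. It assumes a 64-bit RHEL/CentOS-style lab server (EPEL implies a yum-based system); the exact CLI installer and package names are assumptions, not part of the lab text.

    ```shell
    # Install the AWS CLI v2 (bundled installer for x86_64 Linux)
    curl -o awscliv2.zip "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip"
    unzip awscliv2.zip
    sudo ./aws/install

    # Configure the CLI for your user; at the prompts, enter the Access Key ID
    # and Secret Access Key from your lab credentials, us-east-1, and json
    aws configure

    # Copy the credentials to the root user, since the mount will run as root
    sudo cp -r ~/.aws /root/

    # Install s3fs-fuse from EPEL
    sudo yum install -y s3fs-fuse
    ```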
  2. Challenge

    Prepare the Bucket

    1. Create a mount point on the server.
    2. Mount the S3 bucket.
    • Use the bucket name provided to you with the lab credentials. It is helpful to set this as an environment variable.
    • To prevent a lot of extra calls to S3 that will increase your AWS bill, enable local file system caching by setting the use_cache option when mounting.
    3. Copy the website files into the bucket.
    4. Verify the files are in the folder.
    5. Verify the files are in the S3 bucket.
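    As a sketch, the bucket-preparation steps could look like this. The mount point `/mnt/widget-factory` and the source directory `~/web` are hypothetical names for illustration; substitute your own paths and the bucket name from your lab credentials.

    ```shell
    # Bucket name from your lab credentials, kept in a variable for reuse
    export BUCKET=<your-lab-bucket-name>

    # Create a mount point and mount the bucket; use_cache enables local
    # file system caching, reducing repeated S3 calls (and cost)
    sudo mkdir -p /mnt/widget-factory
    sudo s3fs "$BUCKET" /mnt/widget-factory -o allow_other -o use_cache=/tmp/s3fs

    # Copy the website files into the mounted directory
    cp -r ~/web/* /mnt/widget-factory/

    # Verify locally, then verify the objects landed in S3
    ls -l /mnt/widget-factory/
    aws s3 ls "s3://$BUCKET/"
    ```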
  3. Challenge

    Use the S3 Bucket Files in a Docker Container

    1. Run an httpd container to serve the website. Remember to mount the bucket and publish the web server port.
    2. View the webpage in a browser. Use the server's public IP provided with the lab.
    3. Create a new page in the bucket, called newPage.html, by making a copy of an existing page.
    4. View the new webpage in a browser. This will be at: http://<ServerPublicIP>/newPage.html.
    5. Verify that the new webpage is in the S3 bucket.
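    The container steps above might be sketched as follows, reusing the assumed `/mnt/widget-factory` mount point and `$BUCKET` variable from the previous challenge. The container name and the existing page name `index.html` are assumptions.

    ```shell
    # Run httpd, bind-mounting the S3-backed directory over Apache's
    # document root and publishing the web server port
    docker run -d --name web1 -p 80:80 \
      -v /mnt/widget-factory:/usr/local/apache2/htdocs:ro \
      httpd:2.4

    # Create a new page in the bucket by copying an existing one
    cp /mnt/widget-factory/index.html /mnt/widget-factory/newPage.html

    # View the new page (use the server's public IP from a browser),
    # then verify it reached the S3 bucket
    curl "http://localhost/newPage.html"
    aws s3 ls "s3://$BUCKET/" | grep newPage.html
    ```

    Mounting the directory read-only (`:ro`) in the container is a design choice: the web server only serves the files, while changes are made through the host-side mount so they propagate to S3.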

The Cloud Content team comprises subject matter experts hyper-focused on services offered by the leading cloud vendors (AWS, GCP, and Azure), as well as cloud-related technologies such as Linux and DevOps. The team is thrilled to share their knowledge to help you build modern tech solutions from the ground up, secure and optimize your environments, and so much more!
