Using Terraform to Auto Scale and Load Balance Compute Engine Instances in GCP

1 hour
  • 4 Learning Objectives

About this Hands-on Lab

Learning how to configure complex environments with Terraform is a must-have skill. In this hands-on lab, we will provision an autoscaling group with a load balancer.

Learning Objectives

Successfully complete this lab by achieving the following learning objectives:

Create a Service Account
  1. From Google Cloud console’s main navigation, choose IAM & Admin > Service Accounts.
  2. Click Create service account.
  3. Give your service account a name.
  4. Click Create.
  5. In the roles dropdown, select Project > Owner.
  6. Click Continue and then Done.
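The same service account can also be created from a terminal with the gcloud CLI; a rough sketch, assuming a hypothetical account name of auto-scaling-sa and the same <PROJECT_NAME> placeholder used later in this lab:

```shell
# Create the service account (auto-scaling-sa is a hypothetical name).
gcloud iam service-accounts create auto-scaling-sa \
    --display-name="auto-scaling-sa"

# Grant the Project > Owner role, matching step 5 above.
gcloud projects add-iam-policy-binding <PROJECT_NAME> \
    --member="serviceAccount:auto-scaling-sa@<PROJECT_NAME>.iam.gserviceaccount.com" \
    --role="roles/owner"
```

Either route produces an identical service account; the console steps above are what the lab interface expects.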
Log in to the Host Instance and Ensure Terraform Is Installed
  1. From Google Cloud navigation, choose Compute Engine > VM instances.

  2. Click SSH next to terraform-instance.

  3. Use root privileges:

    sudo -i
  4. Change into the root directory:

    cd /
  5. Append the directory containing the Terraform binary to the PATH in the /etc/profile file:

    echo "PATH='$PATH:/downloads/'" >> /etc/profile
  6. Reload the profile so that Terraform can be called:

    source /etc/profile
  7. Call Terraform to verify it is installed:

    terraform
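Steps 5–7 can be sanity-checked without touching /etc/profile; a minimal sketch that writes the same PATH line to a temporary file, sources it, and confirms the directory is on the PATH (the /downloads/ location is taken from the lab):

```shell
# Write the same PATH line the lab appends to /etc/profile,
# but into a temporary file so this sketch is safe to run anywhere.
PROFILE="$(mktemp)"
echo "PATH='$PATH:/downloads/'" >> "$PROFILE"

# Reload it, as `source /etc/profile` does in the lab.
. "$PROFILE"

# Confirm /downloads/ is now on the PATH.
case ":$PATH:" in
  *":/downloads/:"*) echo "PATH updated" ;;
  *)                 echo "PATH missing" ;;
esac
```

The sketch prints "PATH updated" when the directory has been added successfully.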
Create a Service Account Key within the Instance
  1. Allow the SDK to communicate with GCP:

    gcloud auth login
  2. Enter Y at the prompt.

  3. Click on the link in the output.

  4. Select the Cloud Student account.

  5. Click Allow.

  6. Copy the code provided.

  7. Paste the code into the terminal.

  8. Create the service account key, replacing <SERVICE_ACCOUNT_EMAIL> with the email of the service account you created earlier:

    gcloud iam service-accounts keys create /downloads/auto-scaling.json --iam-account <SERVICE_ACCOUNT_EMAIL>
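You can confirm the key was created before moving on; a sketch, using the same <SERVICE_ACCOUNT_EMAIL> placeholder as above:

```shell
# List keys for the account to confirm the new key exists.
gcloud iam service-accounts keys list \
    --iam-account <SERVICE_ACCOUNT_EMAIL>

# Confirm the key file landed where Terraform will look for it.
ls -l /downloads/auto-scaling.json
```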
Create and Deploy the Configuration File
  1. Create the Terraform configuration file (e.g., main.tf) with your preferred editor:

  2. Paste in the following, replacing all instances of <PROJECT_NAME> with your project name, which can be found in the top navigation bar of the Google Cloud console:

    provider "google" {
      version     = "3.20.0"
      credentials = file("/downloads/auto-scaling.json")
      project     = "<PROJECT_NAME>"
      region      = "us-central1"
      zone        = "us-central1-c"
    }

    resource "google_compute_network" "vpc_network" {
      name = "new-terraform-network"
    }

    resource "google_compute_autoscaler" "foobar" {
      name    = "my-autoscaler"
      project = "<PROJECT_NAME>"
      zone    = "us-central1-c"
      target  = google_compute_instance_group_manager.foobar.self_link

      autoscaling_policy {
        max_replicas    = 5
        min_replicas    = 2
        cooldown_period = 60

        cpu_utilization {
          target = 0.5
        }
      }
    }

    resource "google_compute_instance_template" "foobar" {
      name           = "my-instance-template"
      machine_type   = "n1-standard-1"
      can_ip_forward = false
      project        = "<PROJECT_NAME>"

      tags = ["foo", "bar", "allow-lb-service"]

      disk {
        source_image = data.google_compute_image.centos_7.self_link
      }

      network_interface {
        # Attach instances to the network created above.
        network = google_compute_network.vpc_network.self_link
      }

      metadata = {
        foo = "bar"
      }

      service_account {
        scopes = ["userinfo-email", "compute-ro", "storage-ro"]
      }
    }

    resource "google_compute_target_pool" "foobar" {
      name    = "my-target-pool"
      project = "<PROJECT_NAME>"
      region  = "us-central1"
    }

    resource "google_compute_instance_group_manager" "foobar" {
      name    = "my-igm"
      zone    = "us-central1-c"
      project = "<PROJECT_NAME>"

      version {
        instance_template = google_compute_instance_template.foobar.self_link
        name              = "primary"
      }

      target_pools       = [google_compute_target_pool.foobar.self_link]
      base_instance_name = "terraform"
    }

    data "google_compute_image" "centos_7" {
      family  = "centos-7"
      project = "centos-cloud"
    }

    module "lb" {
      source  = "GoogleCloudPlatform/lb/google"
      version = "2.2.0"

      region       = "us-central1"
      name         = "load-balancer"
      service_port = 80
      target_tags  = ["my-target-pool"]
      # The module expects the network name; use the network created above.
      network      = google_compute_network.vpc_network.name
    }
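Once the file is saved, deploying it follows the standard Terraform workflow; a sketch, run from the directory containing the configuration file:

```shell
terraform init      # download the google provider and the lb module
terraform plan      # preview the resources that will be created
terraform apply     # provision the network, template, instance group, autoscaler, and load balancer
terraform destroy   # tear everything down when finished testing
```

Both apply and destroy prompt for confirmation; type yes to proceed.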

Additional Resources

Your team lead wants you to take charge and create an autoscaling group with a load balancer, so they can manually set it to target a pool in the backend. They want it done as quickly as possible so they can test, destroy, and make changes as needed.

Note: Please allow this lab some extra provisioning time before connecting via SSH.
