Handling Streaming Messages with Cloud Pub/Sub

30 minutes • 4 Learning Objectives

About this Hands-on Lab

One of the primary benefits of Cloud Pub/Sub is its ability to handle streaming data, as well as occasional and batch data. Streaming data could come from many sources, including multiple Internet of Things devices. In this hands-on lab, we’ll learn how to set up a Cloud Pub/Sub topic and subscription, simulate streaming data from traffic sensors, and pull multiple records of data from the subscription.

Learning Objectives

Successfully complete this lab by achieving the following learning objectives:

Create a Topic
  1. From the main navigation menu, select Pub/Sub > Topics.
  2. Click Create a topic.
  3. Enter a name for the topic, such as "la-streaming-topic".
  4. Keep the Encryption option at its default setting.
  5. Click Create Topic.
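The same topic can also be created from the gcloud CLI instead of the console. A minimal sketch, assuming the topic name from step 3 and a stand-in project ID ("my-project" is not from the lab); the live lines only assemble and print the resource path, and the actual gcloud call is shown as a comment since it requires an authenticated GCP project:

```shell
# Stand-in project ID; replace with your own.
PROJECT_ID="my-project"
# Topic name from step 3 of the lab.
TOPIC="la-streaming-topic"

# Pub/Sub addresses topics by this fully qualified resource path:
TOPIC_PATH="projects/${PROJECT_ID}/topics/${TOPIC}"
echo "${TOPIC_PATH}"

# CLI equivalent of the console steps above (Google-managed encryption
# applies by default when no KMS key is specified):
# gcloud pubsub topics create "${TOPIC}" --project="${PROJECT_ID}"
```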
Create a Subscription
  1. Drill down into the topic you just created and, in the Subscriptions section at the bottom of the page, choose Create Subscription.
  2. Enter a name for the subscription, such as "la-streaming-subscription".
  3. Ensure the Cloud Pub/Sub topic you just created is selected.
  4. Set Delivery Type to Pull.
  5. Under Retain acknowledged messages, click the Enable option.
  6. Leave all the other options as their defaults.
  7. Click Create.
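As with the topic, the console steps above have a gcloud CLI counterpart. A sketch assuming the names used earlier in the lab; again only the path assembly runs live, with the gcloud call left as a comment because it needs a real project:

```shell
# Names matching the console steps above.
TOPIC="la-streaming-topic"
SUBSCRIPTION="la-streaming-subscription"

# Subscriptions are also addressed by a fully qualified resource path
# ("my-project" is a stand-in project ID):
SUB_PATH="projects/my-project/subscriptions/${SUBSCRIPTION}"
echo "${SUB_PATH}"

# CLI equivalent: a pull subscription (the default when no push endpoint
# is given) that retains acknowledged messages, as in step 5:
# gcloud pubsub subscriptions create "${SUBSCRIPTION}" \
#   --topic="${TOPIC}" --retain-acked-messages
```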
Retrieve the Files
  1. From the top navigation, click Activate Cloud Shell.
  2. In the Cloud Shell, enter the following command to clone the GitHub repository:
    git clone https://github.com/ACloudGuru-Resources/training-data-analyst.git
  3. Change to the training-data-analyst/courses/streaming/publish directory:
    cd training-data-analyst/courses/streaming/publish
  4. Copy the data file from a Cloud Storage bucket:
    gsutil cp gs://cloud-training-demos/sandiego/sensor_obs2008.csv.gz .
  5. Open the Cloud Shell code editor.
  6. Review the file send_sensor_data.py in the training-data-analyst/courses/streaming/publish folder.
  7. On line 26, change the TOPIC variable from sandiego to the last part of your topic name (following the final ‘/’).
  8. Save the file.
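The step-7 edit can also be made from the shell with sed instead of the code editor. This is a sketch against a scratch stand-in file, since the exact quoting on line 26 of send_sensor_data.py may differ from what is shown here:

```shell
# Recreate the relevant line in a scratch copy (stand-in for the real file):
echo "TOPIC = 'sandiego'" > /tmp/send_sensor_data_demo.py

# Replace the sandiego value with the last segment of the topic name
# (the part after the final '/'):
sed -i "s/'sandiego'/'la-streaming-topic'/" /tmp/send_sensor_data_demo.py

# Confirm the edit took effect:
cat /tmp/send_sensor_data_demo.py
```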
Stream Data and Confirm Operation
  1. Enable the Cloud Resource Manager API:
    gcloud services enable cloudresourcemanager.googleapis.com
  2. Authenticate the shell with the following command:
    gcloud auth application-default login --no-launch-browser
  3. Click the generated link to confirm the authentication.
  4. Execute the following command to install the Google Cloud Pub/Sub library:
    sudo pip3 install google-cloud-pubsub
  5. Execute the following command to simulate streaming data:
    ./send_sensor_data.py --speedFactor=60 --project=[PROJECT_ID]
  6. Create a new Cloud Shell instance by clicking the plus (+) icon.
  7. Change directory to the working folder:
    cd training-data-analyst/courses/streaming/publish
  8. Pull the messages from the subscription with the following command:
    gcloud pubsub subscriptions pull --auto-ack [SUBSCRIPTION_NAME] --limit=25
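The pull in step 8 returns a single batch of up to 25 messages. A sketch of how the command looks with the placeholder filled in (the subscription name is assumed from earlier in the lab); only the string assembly runs live, and a simple polling loop for continuous monitoring is shown as a comment:

```shell
# Stand-in for [SUBSCRIPTION_NAME]; substitute your own.
SUBSCRIPTION="la-streaming-subscription"

# The pull command from step 8 with the placeholder filled in:
PULL_CMD="gcloud pubsub subscriptions pull --auto-ack ${SUBSCRIPTION} --limit=25"
echo "${PULL_CMD}"

# To watch the stream continuously while the simulator runs, repeat the
# pull every few seconds (press Ctrl+C to stop):
# while true; do ${PULL_CMD}; sleep 5; done
```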

Additional Resources

Your company has just been hired to track all the sensor data coming from traffic sensors in a major American city. You’ve been asked to verify that Cloud Pub/Sub is capable of handling ongoing streaming data.

You’ll need to complete the following steps to accomplish your task:

  1. Enable Cloud Pub/Sub.
  2. Create a Pub/Sub topic.
  3. Create a Pub/Sub subscription.
  4. Retrieve the necessary files.
  5. Stream data.
  6. Retrieve messages.

