
Viewing Cloud IoT Core Data Using BigQuery

Welcome to this hands-on lab, *Viewing Cloud IoT Core Data Using BigQuery*. Google Cloud IoT Core is a fully managed service that manages and ingests data from millions of globally dispersed devices. You will gain experience by using IoT Core to collect data from a single device; the same configuration steps, however, scale to millions of devices. In this lab, you will:

- Get hands-on with IoT Core: create a registry, add a device, and send simulated data from a Compute Engine VM.
- Configure a Cloud Dataflow template job that collects the data from a Pub/Sub subscription, transforms it from JSON to table format, and stores it in BigQuery.
- Query the data in BigQuery, sorted by the time it was collected, to prove the IoT data pipeline moves data from Pub/Sub through Dataflow and stores it in BigQuery in the correct format.
- Export the data to Data Studio and display the heart rate data over the time it was collected.
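
Each simulated reading travels through the pipeline as a small JSON document: the Dataflow template parses every Pub/Sub message body as JSON and maps its keys onto the columns of the destination BigQuery table. Below is a minimal sketch of such a reading; only the timecollected field is named elsewhere in this lab, so the remaining field names are illustrative assumptions.

```python
import json
import time

# Illustrative heart-rate reading. Only "timecollected" is referenced later in
# the lab; "sensorID" and "heartrate" are assumptions made for this sketch.
reading = {
    "sensorID": "hrsensor007",
    "timecollected": time.strftime("%Y-%m-%d %H:%M:%S"),
    "heartrate": 72,
}

# The Dataflow template expects each Pub/Sub message body to be a flat JSON
# object whose keys match the BigQuery table's column names.
payload = json.dumps(reading)
print(payload)
```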

Path Info

Level: Intermediate
Duration: 1h 30m
Published: Nov 06, 2020

Table of Contents

  1. Challenge

    Ingest the IoT Core Data

    1. Create a Google Cloud IoT Core registry called us-iot-hr-trial, a Pub/Sub topic called us-iot-hr-queue, and a Pub/Sub subscription called us-iot-hr-data (a scripted sketch of these steps follows this challenge).

    2. Register the VM hrsensor007 in IoT Core and create a public/private key pair.

    3. Send simulated data to the IoT Core device using the heartrateSimulator.py script on the VM.
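
You will normally complete these steps in the Cloud Console or with gcloud, but the same resources can be created programmatically. The sketch below uses the google-cloud-pubsub and google-cloud-iot client libraries; the project ID, region, and certificate path are placeholder assumptions, while the resource names come from the steps above.

```python
from google.cloud import iot_v1, pubsub_v1

project = "your-project-id"  # assumption: replace with the lab's project ID
region = "us-central1"       # assumption: any IoT Core-supported region works

# 1. Pub/Sub topic and subscription that will carry the device telemetry.
publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
topic_path = publisher.topic_path(project, "us-iot-hr-queue")
sub_path = subscriber.subscription_path(project, "us-iot-hr-data")
publisher.create_topic(request={"name": topic_path})
subscriber.create_subscription(request={"name": sub_path, "topic": topic_path})

# 2. IoT Core registry that forwards telemetry events to the topic.
iot = iot_v1.DeviceManagerClient()
parent = f"projects/{project}/locations/{region}"
iot.create_device_registry(
    request={
        "parent": parent,
        "device_registry": {
            "id": "us-iot-hr-trial",
            "event_notification_configs": [{"pubsub_topic_name": topic_path}],
        },
    }
)

# 3. Register the device with the public half of an RSA key pair
#    (generated on the hrsensor007 VM, e.g. with openssl).
with open("rsa_cert.pem") as f:  # assumption: path to the public certificate
    certificate = f.read()

iot.create_device(
    request={
        "parent": iot.registry_path(project, region, "us-iot-hr-trial"),
        "device": {
            "id": "hrsensor007",
            "credentials": [
                {
                    "public_key": {
                        "format": iot_v1.PublicKeyFormat.RSA_X509_PEM,
                        "key": certificate,
                    }
                }
            ],
        },
    }
)
```

With the device registered, the heartrateSimulator.py script on the VM authenticates as hrsensor007 with the matching private key and publishes simulated readings, which arrive on the us-iot-hr-queue topic and wait in the us-iot-hr-data subscription.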

  2. Challenge

    Build a Cloud Dataflow Pipeline

    1. Create a BigQuery dataset called heartratedata and a table called heartratedatatable.

    2. Create a Cloud Storage bucket and collect the endpoints for the Pub/Sub subscription and the Cloud Storage bucket.

    3. Create a Dataflow job from the Pub/Sub Subscription to BigQuery template, supplying the Dataflow job name, the BigQuery table information, the Pub/Sub subscription link, and the bucket location (see the sketch after this challenge).
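
As with the previous challenge, the Cloud Console is the expected route, but the equivalent API calls look roughly like the sketch below. The table schema (apart from timecollected), the bucket name, and the job name are assumptions for illustration; the job itself is the Google-provided Pub/Sub Subscription to BigQuery template, launched here through the Dataflow templates API.

```python
from google.cloud import bigquery, storage
from googleapiclient.discovery import build

project = "your-project-id"                 # assumption
region = "us-central1"                      # assumption
bucket_name = f"{project}-iot-hr-staging"   # assumption: any globally unique name

# 1. BigQuery dataset and table that will receive the streamed rows.
bq = bigquery.Client(project=project)
bq.create_dataset("heartratedata")
schema = [
    bigquery.SchemaField("sensorID", "STRING"),          # assumed field
    bigquery.SchemaField("timecollected", "TIMESTAMP"),  # named in the lab
    bigquery.SchemaField("heartrate", "INTEGER"),        # assumed field
]
table = bigquery.Table(f"{project}.heartratedata.heartratedatatable", schema=schema)
bq.create_table(table)

# 2. Cloud Storage bucket for the Dataflow job's temporary files.
storage.Client(project=project).create_bucket(bucket_name, location=region)

# 3. Launch the Google-provided "Pub/Sub Subscription to BigQuery" template.
dataflow = build("dataflow", "v1b3")
dataflow.projects().locations().templates().launch(
    projectId=project,
    location=region,
    gcsPath="gs://dataflow-templates/latest/PubSub_Subscription_to_BigQuery",
    body={
        "jobName": "us-iot-hr-pipeline",  # assumption: any valid job name
        "parameters": {
            "inputSubscription": f"projects/{project}/subscriptions/us-iot-hr-data",
            "outputTableSpec": f"{project}:heartratedata.heartratedatatable",
        },
        "environment": {"tempLocation": f"gs://{bucket_name}/temp"},
    },
).execute()
```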

  3. Challenge

    View Our Data in BigQuery

    Run a query to view the IoT data that was received in BigQuery, ordering the results by timecollected. Once the data has been viewed in BigQuery, export the results to Data Studio and observe the data in a graphical format (a sample query follows this challenge).
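
Below is a minimal example of the kind of query this challenge calls for, run here through the BigQuery Python client; the same SQL can be pasted straight into the BigQuery console, and the LIMIT is an arbitrary choice for this sketch.

```python
from google.cloud import bigquery

bq = bigquery.Client()

# Return the streamed heart-rate rows in the order they were collected, to
# confirm the pipeline delivered data from Pub/Sub through Dataflow into BigQuery.
sql = """
    SELECT *
    FROM `heartratedata.heartratedatatable`
    ORDER BY timecollected
    LIMIT 100
"""

for row in bq.query(sql).result():
    print(dict(row.items()))
```

From the BigQuery results view, you can then explore the results in Data Studio and chart heartrate against timecollected.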

The Cloud Content team comprises subject matter experts hyper-focused on services offered by the leading cloud vendors (AWS, GCP, and Azure), as well as cloud-related technologies such as Linux and DevOps. The team is thrilled to share their knowledge to help you build modern tech solutions from the ground up, secure and optimize your environments, and so much more!

What's a lab?

Hands-on Labs are real environments created by industry experts to help you learn. These environments let you gain knowledge and experience, practice without compromising your system, test without risk, destroy without fear, and learn from your mistakes. Hands-on Labs: practice your skills before delivering in the real world.

Provided environment for hands-on practice

We will provide the credentials and environment necessary for you to practice right within your browser.

Guided walkthrough

Follow along with the author’s guided walkthrough and build something new in your provided environment!

Did you know?

On average, you retain 75% more of your learning if you get time for practice.
