Google Certified Associate Cloud Engineer 2020

How to process multiple events streams with Pub/Sub and BigQuery?

Hi! I have a few event streams (video impressions, API calls, etc.) and want to save all of those logs. Currently I cache them in Redis or MongoDB, and a daily job uploads CSV files to Google Cloud Storage.

But while learning more about Google Cloud, I found an almost fully automated solution for saving logs: publish events to Pub/Sub and write them into BigQuery with the Dataflow streaming job template "Pub/Sub to BigQuery". Logically, I wanted to create a new Pub/Sub topic for each event type, but one streaming job costs at least $50–60 per month, so it doesn't make sense to run a separate job per topic (event type). So my idea is to have one topic and one BigQuery table to store all logs, and then run a batch job that processes the logs by event type and saves them to the respective tables.
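To make the single-topic idea concrete, here is a minimal sketch of how I'd tag each event before publishing, carrying the event type as a Pub/Sub message attribute so the unified table can later be split. The function and field names are just placeholders, not from any existing setup:

```python
import json

def make_message(event_type, payload):
    """Serialize an event payload and attach its type as a Pub/Sub attribute.

    Returns the (data, attributes) pair that the google-cloud-pubsub
    publisher expects: bytes for the message body, a str->str dict for
    attributes.
    """
    data = json.dumps(payload).encode("utf-8")
    attributes = {"event_type": event_type}
    return data, attributes

# Example: a video impression event, ready to publish to the single topic.
data, attrs = make_message("video_impression", {"video_id": "abc", "watch_ms": 1200})

# With the google-cloud-pubsub client this would be sent roughly as:
# publisher.publish(topic_path, data, **attrs)
```

Since the type travels as an attribute (and can also be kept as a column in the BigQuery row), the downstream batch job has a clean key to partition on.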

What do you think, is this the right way to go? Or are there other options?
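For reference, this is roughly how I imagine the batch routing step: one query per event type, appending that type's rows from the unified table into its own table. Table and column names (`my_project.logs.all_events`, `event_type`, `publish_time`) are placeholders I made up for the sketch:

```python
# Hypothetical event types collected in the single unified table.
EVENT_TYPES = ["video_impression", "api_call"]

def routing_query(event_type):
    """Build the SQL that selects one event type's rows for the current day.

    The unified table and its columns are placeholder names; the real
    schema would come from the Pub/Sub-to-BigQuery streaming job.
    """
    return (
        "SELECT * FROM `my_project.logs.all_events` "
        f"WHERE event_type = '{event_type}' "
        "AND DATE(publish_time) = CURRENT_DATE()"
    )

# With the google-cloud-bigquery client, each query could run as a batch
# job writing into the per-type table, e.g.:
# job_config = bigquery.QueryJobConfig(
#     destination=f"my_project.logs.{event_type}",
#     write_disposition="WRITE_APPEND",
# )
# client.query(routing_query(event_type), job_config=job_config).result()
```

A scheduled query or a small Cloud Scheduler + Cloud Functions setup could run this daily instead of a Dataflow job per topic.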


0 Answers
