Hi! I have a few event streams (video impressions, API calls, etc.) and want to save all of those logs. Currently I'm caching them in Redis or MongoDB, and a daily job uploads CSV files to Google Cloud Storage.
But while learning more about Google Cloud, I found an almost fully automated way to save logs: publish events to Pub/Sub and write them into BigQuery with the "Pub/Sub to BigQuery" Dataflow streaming job template. Logically, I wanted to create a new Pub/Sub topic for each event type, but one streaming job costs at least $50-60 per month, so it doesn't make sense to run a separate job for each topic (event type). So my idea is to have one topic and one BigQuery table to store all logs, and then run a batch job that processes the logs by event type and saves them to their respective tables (rough sketch below).
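To make it concrete, this is roughly the daily split job I have in mind. It's only a sketch: names like `raw_events`, `event_type`, and `publish_time` are placeholders I made up, and the actual columns depend on the schema my messages have and what the streaming template writes.

```python
from google.cloud import bigquery

# All names here are placeholders: "raw_events" is the single table the
# streaming job writes into, and each event type gets its own table in the
# same dataset. The "event_type" and "publish_time" columns are assumptions
# about my own message schema, not something the template adds automatically.
PROJECT = "my-project"
DATASET = "logs"
RAW_TABLE = f"{PROJECT}.{DATASET}.raw_events"
EVENT_TYPES = ["video_impression", "api_call"]

client = bigquery.Client(project=PROJECT)

for event_type in EVENT_TYPES:
    destination = bigquery.TableReference.from_string(
        f"{PROJECT}.{DATASET}.{event_type}"
    )
    job_config = bigquery.QueryJobConfig(
        destination=destination,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        query_parameters=[
            bigquery.ScalarQueryParameter("event_type", "STRING", event_type)
        ],
    )
    # Copy only yesterday's rows for this event type into its own table.
    sql = f"""
        SELECT *
        FROM `{RAW_TABLE}`
        WHERE event_type = @event_type
          AND DATE(publish_time) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    """
    client.query(sql, job_config=job_config).result()  # wait for the copy to finish
```

I'd run this once a day from a scheduler, basically replacing my current daily CSV upload job.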
What do you think, is this the right way to go? Or are there other options I should consider?
Thanks!