I’ve set up and torn down the challenge GCS bucket and GCE instance a couple of times (using scripts). The first time I forgot to give the correct permissions to the service account but, on each subsequent occasion, the log file has appeared in the bucket. However, streaming the logs in Stackdriver was only effective (i.e., all events were captured) on the first run. Since then, each run has yielded only four events. I have SSHed into the VM and catted the syslog, and the events are all there. Does anyone have ideas why the logging only showed up properly on the first occasion?
I have a guess that the answer to your question appears at 4:31 in this demo. 🙂
But if you want to try to figure it out with fewer details, I’d suggest that you think about and double-check what it was that you changed between your first attempt (when SD was working but GCS was not working) and second attempt (when GCS started working but SD was no longer working).
And if it turns out my guess is wrong, then please let us know!
Hi Mattias! Thank you for your response. I must admit that I’m still unsure as to what caused the change in behaviour. The only change to the scripts was to append
--scopes=https://www.googleapis.com/auth/devstorage.read_write to the instance creation call. The ‘read’ part of that is unnecessary, of course, but that can’t be changing the behaviour. I’ll ponder some more …
Ah! I originally missed that you were doing it via the script, but the answer is the same: you inadvertently removed the Stackdriver scopes, so the instance is no longer able to write to it. See https://cloud.google.com/sdk/gcloud/reference/compute/instances/create#--scopes , which notes,
If not provided, the instance will be assigned the default scopes, described below. So when you do specify it, it is not additive to the default; it completely overrides it. You need to specify both GCS and SD access in the scopes, via the command line. Hope this helps! 🙂
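To make that concrete, here is a minimal sketch of what the corrected creation call could look like. The instance name, zone, and machine type are placeholders, not values from this thread; the point is that both the Storage scope and the Logging/Monitoring (Stackdriver) scopes appear in the same comma-separated --scopes value, since the flag replaces the defaults rather than adding to them.

```shell
# Placeholder instance name, zone, and machine type -- substitute your own.
# --scopes fully replaces the default scopes, so list everything you need:
# GCS read/write plus the Stackdriver Logging and Monitoring write scopes.
gcloud compute instances create my-challenge-vm \
  --zone=us-central1-a \
  --machine-type=e2-small \
  --scopes=https://www.googleapis.com/auth/devstorage.read_write,https://www.googleapis.com/auth/logging.write,https://www.googleapis.com/auth/monitoring.write
```

gcloud also accepts short aliases for these scope URIs (e.g. storage-rw, logging-write, monitoring-write), which keeps the command a bit more readable.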
Yes it does! Thank you so much!
Awesome! No problem! 🙂