Google Certified Associate Cloud Engineer 2020


GCS and GCE challenge lab

In the GCS and GCE challenge lab, I created a project using the GCP console and then created a bucket using the 'gsutil' command. Then I copied the startup script into this bucket and added the zone and region to my config settings. Finally, I used the following command to spin up the GCE instance with the custom metadata, startup script, and appropriate scope.
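For context, the setup commands I ran looked roughly like this (the bucket name matches the one used in the command below; the region and zone values are just the ones I happened to pick, so adjust as needed):

gsutil mb gs://lab-challenge-log/
gsutil cp worker-startup-script.sh gs://lab-challenge-log/
gcloud config set compute/region us-central1
gcloud config set compute/zone us-central1-a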

gcloud compute instances create lab-challenge-vm --machine-type f1-micro --scopes storage-rw --metadata lab-logs-bucket=gs://lab-challenge-log/,startup-script-url=gs://lab-challenge-log/worker-startup-script.sh

This did work: the GCE instance was created, the CPU load was higher initially, and a text file exists in the storage bucket. I can see logs in Stackdriver, but there are only a few, and I don't see any logs related to the startup script. I am a little confused.

Do I need to include logging and monitoring read/write in the scopes? Or should I use --scopes=default,storage-rw?

1 Answer

Good job! You're making great progress! And you're also on the right track in wondering about the scopes. I suggest you compare what gets set on a working instance you created via the UI against the CLI-created instance for which you don't see startup logs sent by the Stackdriver agent. (Another hint there. 😉)
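For example, something along these lines will show the service account and scopes each instance ended up with (the instance names here are just placeholders for whichever two instances you want to compare):

gcloud compute instances describe lab-challenge-vm --format="yaml(serviceAccounts)"
gcloud compute instances describe <ui-created-instance> --format="yaml(serviceAccounts)"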

Keep it up!

Mattias

Pavan Ponugoti

Thank you, that worked. When I used --scopes=default,storage-rw in Cloud Shell instead of just --scopes=storage-rw, I could see the startup script logs in Stackdriver, similar to the logs generated when I provisioned the VM using the console. I can see now why it didn't work the first time: the default scopes need to be explicitly mentioned alongside storage-rw.
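For anyone else who hits this, the working command ended up looking roughly like this (the same as before, just with default added to the scopes):

gcloud compute instances create lab-challenge-vm --machine-type f1-micro --scopes default,storage-rw --metadata lab-logs-bucket=gs://lab-challenge-log/,startup-script-url=gs://lab-challenge-log/worker-startup-script.sh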

Mattias Andersson

Boom! 💥 You have just learned from experience and will not forget how these things work. 😁 Glad I could help.
