In the demo of the lab’s solution, I see that the correct setting for Scopes is, "Set access for each API", and then the sub-selection of Storage = "Write Only". When I choose this option, the lab works. Yay!
Before seeing the demo, I had selected another value for Scopes: "Allow full access to all Cloud APIs". I assumed this would give me Read and Write access to all APIs and thus would be guaranteed to work. But… when I set this value, the text file never appears in my GCS bucket. The other two items look OK: (a) the CPU graph and (b) the Stackdriver logs. Why doesn't the text file appear? Does "Allow full access to all Cloud APIs" not grant write access to Cloud Storage?
Hello, Steve! The approach I showed, setting only the specific scopes needed, is the preferred one because it follows the principle of least privilege, but what you did (setting the scope to "Allow full access to all Cloud APIs") really should have worked, too!
I suspect you unknowingly tweaked some other small thing between those two tries. My suggestion: take an instance that you have seen work and use "Create Similar" to make another copy of it. If you want to be very careful in your testing, you can first change nothing and confirm that this setup still works. Then "Create Similar" another one and change only the scopes from the specific ones to "Allow full access to all Cloud APIs". I know it may seem like going backwards, but it's all for learning! 🙂 If this change really does stop it from working (which would surprise me and would indeed be interesting!), then you should try to debug that broken instance.
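If you'd rather run this experiment from the command line, it might look roughly like the sketch below. The instance names and zone are placeholders I've made up; only the `--scopes` value differs between the two, which is the point of the test:

```shell
# Baseline: the specific scope from the demo (write-only access to Storage).
# Instance names and zone are placeholders -- substitute your own.
gcloud compute instances create test-specific-scope \
  --zone=us-central1-a \
  --scopes=https://www.googleapis.com/auth/devstorage.write_only

# Variant: change ONLY the scope to full access to all Cloud APIs.
gcloud compute instances create test-full-access \
  --zone=us-central1-a \
  --scopes=https://www.googleapis.com/auth/cloud-platform
```

Keeping everything else identical (same service account, same metadata, same startup script) is what makes the comparison meaningful.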
To debug it, start with the easiest information about how far it got: check the bucket (which is presumably the broken thing) and the Stackdriver logs. If there are logs, look through them for the startup-script lines that try to write the file to the bucket. Compare the Stackdriver log lines from the instance that is working with those from the one that isn't. That really should show something about why it failed. The first log line that is materially different between the two is highly suspect, even if it isn't explicitly an error.
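Those checks can be done quickly from Cloud Shell. The bucket name and zone below are placeholders; the serial-port output is one easy place to see what the startup script actually did:

```shell
# Easiest check first: did the file actually land in the bucket?
# (Bucket name is a placeholder -- use the one from your instance metadata.)
gsutil ls gs://YOUR-BUCKET-NAME/

# Pull the startup output for each instance, then diff the two side by side.
gcloud compute instances get-serial-port-output test-full-access \
  --zone=us-central1-a
```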
Now, if you happen to know which instance was previously having the problem that prompted this post, then do the same thing there, too: compare the log lines to try to figure out why it was failing. Was it really the scopes, or was something else different, like the service account used, or the bucket name in the metadata, or something else entirely?
I hope this helps you figure out this issue and move your learning forward. I’m curious to hear back about what you find!
I created the new project, but when I try to create a bucket, I'm not able to do it. I get:
You can use Cloud Storage after you enable billing
Pay only for what you use. Learn more about Cloud Storage pricing.
Would you help me please?
Hello Andres. You have probably hit your billing account linking quota. Rewatch the "Milestone: Open World" lecture for some more tips on this: https://acloud.guru/course/gcp-certified-associate-cloud-engineer/learn/account-setup/milestone-open-world/watch And if that doesn’t work, please post a new question describing your situation (instead of continuing this conversation via answers or comments on an unrelated question).
I did this too, but via the command line with `--scopes=https://www.googleapis.com/auth/cloud-platform`, and found that my impatience was the problem: it took 2–3 minutes for the whole startup process to complete. Checking the Stackdriver logs, you should see no errors after the gsutil command that copies the file, but the process won't be complete until those logs show up. If the logs say the file was copied OK and nothing is appearing in the bucket, then there may be an error in your metadata.
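One way to rule out a metadata mistake is to read back what the instance actually has. A possible sketch (instance name and zone are placeholders):

```shell
# Show the instance's metadata items (startup script, bucket name, etc.)
# so you can spot a typo in the bucket name or script.
gcloud compute instances describe my-instance \
  --zone=us-central1-a \
  --format="value(metadata.items)"
```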