Just finished the challenge lab from the console as well as from GCS. I want to share it with you and others before I look at the answer:
I did not do error checking or any validation; it was just a quick and dirty script to complete the lab, not to test my shell scripting abilities :). I thought it would be straightforward, but I had to research how to grant full scope from the shell and how to link a billing project.
Your feedback will be appreciated!
#!/bin/bash
# Lab variables
PROJECT=acloudg-lab4-xxxx
INSTANCE=gk-inst-6
BUCKET=challenge-bucket-5
SCRIPT=https://raw.githubusercontent.com/ACloudGuru/gcp-cloud-engineer/master/compute-labs/worker-startup-script.sh

# Create the project and link it to the billing account
gcloud projects create $PROJECT
gcloud alpha billing accounts projects link $PROJECT --account-id xxxxx-xxxxx-xxxxx
gcloud config set project $PROJECT

# Create the logs bucket and enable the Compute Engine API
gsutil mb gs://$BUCKET
gcloud services enable compute.googleapis.com

# Launch the worker instance with the startup script and full cloud-platform scope
gcloud --project $PROJECT compute instances create $INSTANCE \
  --zone us-central1-a \
  --metadata=startup-script-url=$SCRIPT,lab-logs-bucket=gs://$BUCKET/ \
  --scopes https://www.googleapis.com/auth/cloud-platform
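(One small addition, not in my original run: if I wanted even a quick-and-dirty script to fail fast instead of continuing after an error, I could have added bash's strict-mode flags near the top.)

set -euo pipefail  # exit on any error, treat unset variables as errors, fail on broken pipes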
Just a comment: gsutil mb requires permission to create a bucket, so the account running it (or a service account) needs the IAM permissions to do so.
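For instance, something like this would grant those permissions to a service account (a sketch only; the service account email is a placeholder, and roles/storage.admin is just one role that includes storage.buckets.create):

gcloud projects add-iam-policy-binding $PROJECT \
  --member="serviceAccount:lab-worker@$PROJECT.iam.gserviceaccount.com" \
  --role="roles/storage.admin"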
Looking pretty good, Krishna! 👍 Thanks for posting! 😁
As you found, there are a few things to consider here, like linking billing, enabling the API, and setting scopes. I hope you feel like this was a good way to learn about them, hands-on.
Doesn't the gsutil mb command need you to specify a location? Or were you meaning to enter that part interactively? Anyway, when you're happy with your script, read through some other scripts and discussions to get more ideas of things to try, such as maybe scoping down the scopes to the bare minimum required. 🙂
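For example, if the startup script only needs to write logs to the bucket, a narrower scope might be enough (just a sketch; I haven't checked exactly which scope the lab's worker script requires):

gcloud compute instances create $INSTANCE \
  --zone us-central1-a \
  --metadata=startup-script-url=$SCRIPT,lab-logs-bucket=gs://$BUCKET/ \
  --scopes https://www.googleapis.com/auth/devstorage.read_write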
Edit: Silly me…
If you don't specify a -l option, the bucket is created in the default location (US). I guess I always just specify the location! 😂
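So if you do want to pin it down explicitly (us-central1 here is just an example location):

gsutil mb -l us-central1 gs://$BUCKET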
Man, Mattias, the data flow lesson is excellent. It makes everything clear. Before watching it, I struggled to get the log file created in GCS, all because of the difference between IAM permissions and access scopes. The minute I saw the token explanation, it took me 10 seconds to fix the problem and see the log file in the storage bucket.
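In case it helps anyone else debugging the same thing: from inside the VM you can check which scopes the instance's token actually carries by querying the metadata server (a quick sketch using the standard endpoint):

# Ask the metadata server for the scopes granted to the default service account
curl -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"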