Google Certified Associate Cloud Engineer 2020


GCS and GCE Challenge Lab

Hello Mattias, 

I just finished the challenge lab, from the console as well as from GCS. I want to share it with you and others before I look at the answer:

I did not do any error checking or validation; it was just a quick and dirty script to complete the lab, not a test of my shell-scripting abilities :). I thought it would be straightforward, but I had to research how to grant full scope from the shell and how to link the billing project.

Your feedback will be appreciated !!

-GK


#!/bin/bash

PROJECT=acloudg-lab4-xxxx
INSTANCE=gk-inst-6
BUCKET=challenge-bucket-5
SCRIPT=https://raw.githubusercontent.com/ACloudGuru/gcp-cloud-engineer/master/compute-labs/worker-startup-script.sh

# Create the project and link it to the billing account
gcloud projects create "$PROJECT"
gcloud alpha billing accounts projects link "$PROJECT" --account-id xxxxx-xxxxx-xxxxx
gcloud config set project "$PROJECT"

# Create the bucket that will hold the worker's log file
gsutil mb "gs://$BUCKET"

# Enable the Compute Engine API, then launch the worker instance with the
# startup script and logs bucket passed in as metadata
gcloud services enable compute.googleapis.com
gcloud --project "$PROJECT" compute instances create "$INSTANCE" \
  --zone us-central1-a \
  --metadata=startup-script-url="$SCRIPT",lab-logs-bucket="gs://$BUCKET/" \
  --scopes https://www.googleapis.com/auth/cloud-platform
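For anyone curious how the instance side picks up that lab-logs-bucket value: a startup script running on the VM can read custom metadata from the GCE metadata server. A minimal sketch (the attribute name matches the one set in the script above; the endpoint is the standard metadata-server path):

```shell
# Sketch: read a custom metadata attribute (e.g. lab-logs-bucket) from
# inside the instance via the GCE metadata server
curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/attributes/lab-logs-bucket"
```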

3 Answers

Just a comment – gsutil mb would require permissions to create a bucket, so a service account would need to be created with the IAM permissions to do so.

Mattias Andersson

Ah, I’m glad you posted, Alex! 👍 You’re right that gsutil mb needs permission to create a bucket… but something to consider is where the Service Account is used: the Service Account will be used to reach out from the instance, and that’s not how the bucket gets created. See if watching the Data Flow lecture for this challenge lab helps clarify this for you.

Looking pretty good, Krishna! 👍 Thanks for posting! 😁

As you found, there are a few things to consider here, like linking billing, enabling the API, and setting scopes. I hope you feel this was a good way to learn about them hands-on.

Did the gsutil mb command not need you to specify a location? Or were you meaning to enter that part interactively? Anyway, when you’re happy with your script, read through some other scripts and discussions to get more ideas of things to try, such as scoping the instance down to the bare minimum scopes required. 🙂

Edit: Silly me…  If you don't specify a -l option, the bucket is created in the default location (US). I guess I always just specify the location! 😂
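The scope-minimizing suggestion could look something like this: a sketch assuming the worker only needs to write its log to the bucket (storage-rw is gcloud's alias for the devstorage.read_write scope):

```shell
# Sketch: replace the broad cloud-platform scope with just read/write
# access to Cloud Storage (gcloud alias: storage-rw)
gcloud compute instances create "$INSTANCE" \
  --zone us-central1-a \
  --metadata=startup-script-url="$SCRIPT",lab-logs-bucket="gs://$BUCKET/" \
  --scopes storage-rw
```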

krishna.gadhiraju

Thanks for the feedback, Mattias. gsutil mb didn’t ask me for a location; maybe I had set one earlier using gsutil config.

Mattias Andersson

Np… and I only just now realized that this command has a default location of the US multi-region! Mystery solved! 😆

krishna.gadhiraju

Yes, I created a multi-regional bucket 😀 very true. Maybe I need to mention the location for regional buckets.
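For a regional bucket, the location can be given explicitly rather than relying on the default; a sketch:

```shell
# Sketch: create a regional bucket explicitly instead of relying on the
# default US multi-region (-l sets the bucket location)
gsutil mb -l us-central1 "gs://$BUCKET"
```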

Man, Mattias, the Data Flow lesson is excellent. It makes everything clear. Before watching it, I struggled to get the log file created in Cloud Storage, all because of the difference between IAM permissions and scopes. The minute I saw the token explanation, it took me 10 seconds to fix the problem and see the log file in the bucket.
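For reference, the scopes baked into the instance's access token (the token from that explanation) can be inspected from inside the VM; a sketch using the standard metadata-server path for the default service account:

```shell
# Sketch: list the OAuth scopes granted to the instance's default
# service account, from inside a GCE instance
curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
```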

Mattias Andersson

Rock on! That’s exactly how I hoped this would work, Srika! That you might struggle with it a bit and then maybe some hint helps it click for you. You will never forget this, now–and not because you worked hard to memorize it but because it has become real to you! 😁 I really appreciate you sharing your experience. Thank you.
