While working through the lab, I ran into an issue where gsutil returned a 403 when trying to copy the log file out. I had already changed the VM's access scopes to Storage read/write before hitting this, but in the end I had to explicitly grant the VM's service account the legacyBucketReader and legacyBucketWriter roles on the bucket before the file would copy correctly. I did not see this step when reviewing the demo videos later.
Has Google changed its permission handling since the video was made, so that this extra step is now required?
Here is a snippet of the commands and JSON I needed to make things work from the shell:
gsutil mb -c multi_regional -l us gs://challenge-lab-bucket
gsutil iam get gs://challenge-lab-bucket > bucket_permissions.json
gcloud services enable compute.googleapis.com
gcloud iam service-accounts list
# Edit bucket_permissions.json, using the service account from the previous command
gsutil iam set bucket_permissions.json gs://challenge-lab-bucket
{
  "bindings": [
    {
      "members": [
        "projectEditor:challenge-lab-",
        "projectOwner:challenge-lab-"
      ],
      "role": "roles/storage.legacyBucketOwner"
    },
    {
      "members": [
        "projectViewer:challenge-lab-",
        "serviceAccount:[email protected]"
      ],
      "role": "roles/storage.legacyBucketReader"
    },
    {
      "members": [
        "serviceAccount:[email protected]"
      ],
      "role": "roles/storage.legacyBucketWriter"
    }
  ],
  "etag": "CAE="
}
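As an alternative to round-tripping the policy through a JSON file, I believe the same bindings can be added directly with `gsutil iam ch`. A sketch, assuming the same bucket and service-account names from my commands above:

```shell
# Grant the VM's service account the legacy bucket roles in one step,
# instead of editing and re-setting the full policy JSON.
gsutil iam ch \
  serviceAccount:[email protected]:legacyBucketReader \
  serviceAccount:[email protected]:legacyBucketWriter \
  gs://challenge-lab-bucket

# Verify the resulting policy
gsutil iam get gs://challenge-lab-bucket
```

This avoids the risk of clobbering other bindings with a stale etag, since `iam ch` reads, modifies, and writes the policy for you.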