Google Certified Associate Cloud Engineer 2020


GCS & GCE Challenge Lab : Not able to see the log file in the bucket

Not able to see the log file in the bucket, but the logs show that the file was created and copied to the bucket.

Logs:

I 2020-08-13T01:09:09.051809608Z startup-script: + worker_log_file=machine-mychallengevm-new-finished.txt

I 2020-08-13T01:09:09.052876104Z startup-script: ++ date

I 2020-08-13T01:09:09.054678156Z startup-script: + echo 'Phew! Work completed at Thu Aug 13 01:09:09 UTC 2020'

I 2020-08-13T01:09:09.054711848Z startup-script: + echo 'Copying the log file to the bucket...'

I 2020-08-13T01:09:09.054730106Z startup-script: Copying the log file to the bucket...

I 2020-08-13T01:09:09.054746610Z startup-script: + gsutil cp machine-mychallengevm-new-finished.txt gs://challenge-lab-vm/

Appreciate any help

Thanks

2 Answers

I had the same issue. In Logs Viewer I found the following message:

2020-08-13 12:01:25.000 CEST  Aug 13 10:01:25 challange-lab-instance-1 GCEMetadataScripts[643]: 2020/08/13 10:01:25 GCEMetadataScripts: startup-script: AccessDeniedException: 403 Insufficient Permission

{
  insertId: "ej6n2w0b90nuyf0g0"
  labels: {…}
  logName: "projects/challenge-lab-project-286308/logs/syslog"
  receiveTimestamp: "2020-08-13T10:01:27.001543981Z"
  resource: {…}
  textPayload: "Aug 13 10:01:25 challange-lab-instance-1 GCEMetadataScripts[643]: 2020/08/13 10:01:25 GCEMetadataScripts: startup-script: AccessDeniedException: 403 Insufficient Permission"
  timestamp: "2020-08-13T10:01:25Z"
}

I checked the API access settings on the instance and had a wrong setting for 'Storage'. All the Stackdriver settings were 'Write Only', which is correct, but Storage was set to 'Read Only'. After changing that to 'Write Only' everything worked.

Maybe it’s worth checking if it was the same in your case.
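
If you want to make that same fix without the console, note that access scopes can only be changed while the instance is stopped. A rough sketch from Cloud Shell, using the instance name from the log above and the default Compute Engine service account as a placeholder (add --zone if you have no default zone configured):

# scopes cannot be changed on a running instance, so stop it first
$ gcloud compute instances stop challange-lab-instance-1
$ gcloud compute instances set-service-account challange-lab-instance-1 \
    --service-account PROJECT_NUMBER-compute@developer.gserviceaccount.com \
    --scopes storage-rw,logging-write,monitoring-write
$ gcloud compute instances start challange-lab-instance-1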

Rodrigo Sandoval

Good catch. I faced the same issue and solved it the same way.

CloudBadass

The question is how to fix this without using the console at all.

This is what I wound up doing today, without touching the console at all.

+++++++++++++++++

LAB GCS GCE:

+++++++++++++++++

1- Create New Project

$ gcloud projects create gcs-gce-project-lab --name="GCS and GCE LAB" --labels=type=lab  
$ gcloud config set project gcs-gce-project-lab
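
If you want to confirm the project was created and is now the active one, something like this should do it:

$ gcloud config get-value project
$ gcloud projects describe gcs-gce-project-lab --format="value(lifecycleState)"   # expect ACTIVE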

2- Link project with billing account

$ gcloud beta billing accounts list

ACCOUNT_ID            NAME                       OPEN  MASTER_ACCOUNT_ID
0X0X0X-0X0X0X-0X0X0X  MyAccount Billing account  True
$ gcloud alpha billing accounts projects link gcs-gce-project-lab --billing-account=0X0X0X-0X0X0X-0X0X0X
OR    
$ gcloud beta billing projects link gcs-gce-project-lab --billing-account=0X0X0X-0X0X0X-0X0X0X
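
To double-check that the link took, this should report billingEnabled: true:

$ gcloud beta billing projects describe gcs-gce-project-lab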

3- ENABLE APIs

$ gcloud services enable compute.googleapis.com  
$ gcloud services enable computescanning.googleapis.com
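
Optionally confirm the services actually got enabled:

$ gcloud services list --enabled --filter="compute"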

4- Set default zone

$ gcloud compute project-info add-metadata --metadata google-compute-default-region=us-east1,google-compute-default-zone=us-east1-b --project gcs-gce-project-lab

-- in Cloud Shell

$ gcloud config set compute/zone us-east1-b  
$ gcloud config set compute/region us-east1
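
A quick way to verify the defaults in the current Cloud Shell session:

$ gcloud config get-value compute/zone     # expect us-east1-b
$ gcloud config get-value compute/region   # expect us-east1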

5- Service account :

——————

FORMAT: PROJECT_NUMBER-compute@developer.gserviceaccount.com

$ gcloud projects describe gcs-gce-project-lab |grep projectNumber

projectNumber: '521829558627'

=> 521829558627-compute@developer.gserviceaccount.com
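
To avoid copying the project number by hand, the same value can be captured straight into a variable, roughly like this:

$ PROJECT_NUMBER=$(gcloud projects describe gcs-gce-project-lab --format="value(projectNumber)")
$ echo ${PROJECT_NUMBER}-compute@developer.gserviceaccount.com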

6- Download the script in Cloud Shell

$ wget https://raw.githubusercontent.com/ACloudGuru/gcp-cloud-engineer/master/compute-labs/worker-startup-script.sh

7- Create a bucket

$ gsutil mb -l us-east1 -p gcs-gce-project-lab gs://gcs-gce-bucket
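
An optional check that the bucket exists and landed in the right place (should print a us-east1 location constraint):

$ gsutil ls -L -b gs://gcs-gce-bucket/ | grep -i "location constraint"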

8- Create the GCE instance running the provided startup script, including the storage read/write scope:

SYNTAX: $ gcloud compute instances create [INSTANCE_NAME] --service-account [SERVICE_ACCOUNT_EMAIL] --scopes [SCOPES,...]

$ gcloud compute instances create gcs-gce-vm \
    --metadata lab-logs-bucket=gs://gcs-gce-bucket \
    --metadata-from-file startup-script=./worker-startup-script.sh \
    --machine-type=f1-micro \
    --image-family debian-10 \
    --image-project debian-cloud \
    --service-account 521829558627-compute@developer.gserviceaccount.com \
    --scopes storage-rw,logging-write,monitoring-write,pubsub,service-management,service-control,trace

NAME        ZONE        MACHINE_TYPE  PREEMPTIBLE  INTERNAL_IP  EXTERNAL_IP   STATUS
gcs-gce-vm  us-east1-b  f1-micro                   10.100.0.1   34.72.95.120  RUNNING

$ gcloud compute instances describe gcs-gce-vm --format json |grep storage  
"https://www.googleapis.com/auth/devstorage.read_write"