1 Answer
Constantin, good for you on working through the challenge lab and reaching out. 👍
And I’m glad you were able to see that things had progressed to the point of sending logs to Stackdriver. That tells you a lot, and it confirms that you entered the startup script correctly.
To debug the problem further, I suggest the following:
Start by refreshing the bucket view in the console (and maybe even doing a gsutil ls of it) to make sure that the file is really not there. Next, read all the log lines in Stackdriver logs that refer to the gsutil command. The startup script I made should be logging quite verbosely, so you may be able to tell what’s going wrong, either because the gsutil command being run looks wrong or because it returns an error message.
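For example, something like this from Cloud Shell would do both checks ("my-lab-bucket" is just a placeholder for whatever you named your bucket, and the log filter is only a starting point; adjust it to match how your logs actually show up):

    gsutil ls gs://my-lab-bucket/
    gcloud logging read 'resource.type="gce_instance" AND textPayload:"gsutil"' --limit=50
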
If that doesn’t identify the problem, then SSH into the instance and paste in each line from the "Report that we’re done" section at the end of the script, one line at a time. Take a look at the commands you wind up with and any error messages you get back. (You can echo any line you are about to run, to see exactly what it’ll be; there’s a small example of this a bit further down.)
If you’ve tried that for a while and are still not getting through the block, then watch just the first couple of minutes of the Dataflow lecture, where I list the actions you need to take, and make sure that you’ve done them all.
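To make that echo idea concrete, here’s a minimal sketch of what it might look like in the SSH session (the file name and variable are placeholders, not the actual names from my startup script):

    echo gsutil cp /tmp/finished.txt "$MY_BUCKET"   # prints the exact command, with $MY_BUCKET expanded
    gsutil cp /tmp/finished.txt "$MY_BUCKET"        # then run it for real and watch for errors
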
Failing that, then ROT13 the following for additional hints:
Znxr fher lbh'ir frg gur fpbcrf gb nyybj jevgrf gb fgbentr. Naq znxr fher gung gur zrgnqngn nggevohgr lbh'ir frg unf n anzr/xrl bs "yno-ybtf-ohpxrg" naq n inyhr bs "tf://ybttvat-ohpxrg-grfg3/" (nffhzvat gur ohpxrg lbh perngrq vf pnyyrq "ybttvat-ohpxrg-grfg3").
😀
Finally, I’m currently editing the video where I walk through all the steps, so you should be able to watch and mimic that soon enough. 🙂
Update: OK, that demo of this challenge lab is now live, so that’s another option for double-checking that you’re doing it correctly. But if you have the time, debugging it yourself first can result in strong learning! 🙂
Mattias
Hi Mattias, I was just confused. Watching your demo made me realize that I did everything correctly, and my outcome is exactly as planned. I thought that, next to the file saying "Phew we’re done", there must also be an export of the Stackdriver logs in the bucket. Even after listening to the instructions and the Dataflow video, I was still under the impression that I should see the logs in the bucket and not just the "finished" notification. Maybe others will find it a bit confusing as well. Thanks for the demo and your help; it turned out not to be needed in the end, but it cleared up the misunderstanding. Thanks a lot!
Ah! I’m glad you got it all figured out–and that you had it all right, all along! 😄 I also appreciate you letting me know what the confusion was so I can try to avoid that in the future.