We set up GCE instances using Terraform and then use Ansible playbooks to provision them and deploy our services onto the machines.
I'm running a project in our organisation that needs to pull a Docker image from a different project; the images are hosted in that other project's container registry.
My ideal sequence of events would be:
- Create a GCE instance in my project using Terraform, with a properly configured service account.
- Use Ansible to install Docker on the instance.
- Use the Ansible docker_container module to pull the image I need from the container registry.
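For reference, the playbook I have in mind looks roughly like this (host group, container name, and image path are placeholders, and the module options may need adjusting for your Ansible version):

```yaml
# Sketch of the intended workflow, not a tested playbook.
- hosts: gce_instances
  become: true
  tasks:
    - name: Install Docker
      apt:
        name: docker.io
        state: present

    - name: Run the service container from the other project's registry
      docker_container:
        name: my-service                          # placeholder
        image: gcr.io/other-project/my-image:latest  # placeholder
        state: started
```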
This seemingly simple workflow is not trivial. At first, I discovered that simply running docker_container fails because Docker needs to authenticate first. Since I don't want to log in to the machine and set it up with the credential helper and so on, the only option I've found is to run the command
cat <jsonkeyfile> | docker login -u _json_key --password-stdin https://gcr.io
I can get this to run directly on the command line if I log in to the machine in question, but getting it to work through the Ansible docker_login module is giving me nightmares (that's a separate question), so I want to avoid it altogether.
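For completeness, the docker_login task I've been fighting with looks roughly like this (the key path is a placeholder; _json_key is the literal username GCR expects for key-file authentication):

```yaml
# The failing attempt, roughly: feed the JSON key file's contents
# as the registry password.
- name: Log in to GCR with a service account key
  docker_login:
    registry_url: https://gcr.io
    username: _json_key
    password: "{{ lookup('file', '/tmp/key.json') }}"  # placeholder path
```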
The GCE instance is created (via Terraform) with a dedicated service account attached at creation time. The account has been granted all the required roles, since I can log in and pull images from the command line using its service account key.
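The Terraform side attaches the service account roughly like this (all names are placeholders, and I'm not sure whether it's the access scopes or the IAM roles that matter here):

```hcl
resource "google_compute_instance" "app" {
  name         = "app-vm"           # placeholder
  machine_type = "n1-standard-1"
  zone         = "europe-west1-b"   # placeholder

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-9"
    }
  }

  network_interface {
    network = "default"
  }

  service_account {
    # service account resource defined elsewhere
    email  = google_service_account.app.email
    # "cloud-platform" defers authorization to IAM roles; the default
    # scopes only include read-only storage access.
    scopes = ["cloud-platform"]
  }
}
```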
What I really expect is that in step 1 above, if I attach a service account with the proper permissions, the GCE instance should already be set up to talk to the container registry. Is there a way to make this work purely as part of the startup configuration? I looked into Container-Optimized OS (https://cloud.google.com/container-optimized-os/docs/) but I don't want to switch to it yet, and I don't even know whether registry access is set up out of the box there, although the documentation makes it sound that way.
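To clarify what I'm after: I imagine a startup script along these lines, using the instance's own service account token from the metadata server instead of a key file, though I don't know whether the oauth2accesstoken login actually works this way:

```shell
#!/bin/bash
# Hypothetical startup script: authenticate Docker to gcr.io with a
# short-lived token for the attached service account, so no key file
# ever lands on the machine.
TOKEN=$(curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token" \
  | python3 -c "import sys, json; print(json.load(sys.stdin)['access_token'])")

echo "$TOKEN" | docker login -u oauth2accesstoken --password-stdin https://gcr.io
```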
Is there a way to pre-configure a Docker-ready GCE instance? If not, has anyone got an Ansible-based workflow using docker login to work?