
Let's say that you have a Kubernetes cluster with a few nodes in a project on Google Cloud, and you have one separate instance in that project that all the nodes should have access to.

How do you go about granting the nodes access to this instance dynamically? This assumes you are using the external IP address, not the internal one, and cannot depend on the specific IPs assigned to the nodes.

I have tried doing this using tags and service accounts in the firewall rules, without success. If anyone knows of a better or more elegant way that actually works, I would love to read it.
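For reference, the snippet below is roughly the shape of the tag-based rule I tried; the instance name, tag names, and zone are placeholders, not the actual values from my project:

```
# Sketch of the tag-based approach. "shared-vm", "shared-instance",
# "gke-cluster-node" and the zone are placeholders for illustration.

# Tag the standalone instance so a firewall rule can target it.
gcloud compute instances add-tags shared-vm \
    --tags=shared-instance \
    --zone=us-central1-a

# Ingress rule: allow SSH (or any other port) from instances carrying the
# node tag to instances carrying the target tag.
gcloud compute firewall-rules create nodes-to-shared-instance \
    --network=default \
    --direction=INGRESS \
    --allow=tcp:22 \
    --source-tags=gke-cluster-node \
    --target-tags=shared-instance
```

As far as I can tell, source tags in a rule like this only match traffic arriving from the nodes' internal IPs, which would explain why it does not help when the traffic comes from their external IPs.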

Ulukai
  • Can you elaborate more on your use case? By _"granting access"_, do you mean SSH can be done from the Kubernetes cluster nodes to the separate instance, or something else? What do you mean by _"dynamically access"_: does the access need to be revoked after a certain time? – Taher Feb 01 '18 at 00:58
  • SSH, or any other port for that matter. Not necessarily; what I meant by dynamic is in case the IPs for the nodes change. I tried doing this with tags and service accounts, but I think this only applies to internal IPs, not external ones. – Ulukai Feb 01 '18 at 09:23
  • I have connected to my GKE cluster nodes via ssh and from those nodes tried to ssh to a _separate_ Redhat Linux instance which was **in the same project** using `ssh username@external IP` command, and I was successful in doing so without any configuration change. Is this what you are talking about or there is more in your scenario? – Taher Feb 02 '18 at 04:49

0 Answers