Is it possible to enable communication between Rancher and EKS nodes using internal IP addresses?


My Rancher and EKS clusters are both in the same AWS private network segment, and they can reach each other over internal IP addresses. However, when I imported the EKS cluster into Rancher, there were no configurable options related to this, and communication defaulted to the public IP addresses.

I would like to know if there is a way to make Rancher and the imported EKS cluster communicate over internal IP addresses within the same network segment, while both still retain their public IP addresses.

Thank you very much.


1 Answer

Answered by Slickwarren

Imported clusters work a bit differently from Rancher-provisioned clusters. On the Rancher side, registration is driven by the global setting registration-url. It is used for all nodes and should be set to an address that every node you expect to register can reach; most of the time, that's Rancher's public address. You can change it if you want every cluster connected to Rancher to use its private IP, but keep in mind this is a global-level setting, so it affects all clusters, not just the imported one in question. A sketch of changing it is shown below.
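If it helps, Rancher exposes its global settings as cluster-scoped Setting custom resources on the management ("local") cluster, so one way to inspect and change the setting named above is via kubectl. This is a minimal sketch: the setting name is the one referenced in this answer (verify it on your install with `kubectl get settings.management.cattle.io`), and the URL is a placeholder for your private address.

```bash
# Run against the Rancher management ("local") cluster.
# The setting name below is taken from this answer, and the URL is a
# placeholder private address -- adjust both for your environment.

# Inspect the current value
kubectl get setting.management.cattle.io registration-url -o jsonpath='{.value}'

# Point it at Rancher's private address (it must be reachable from all
# nodes and covered by Rancher's TLS certificate)
kubectl patch setting.management.cattle.io registration-url --type merge \
  -p '{"value":"https://rancher.internal.example.com"}'
```

Most setups change this through the Rancher UI under Global Settings instead; the kubectl route is just the scriptable equivalent.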

I think there's logic in Rancher's provisioned clusters that will use private IPs when both public and private are set, but again I'm unsure of the exact behavior for imported clusters, since there's no specific setting for it at import time.
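If you only want this one imported cluster to talk to Rancher over the private address, one hedged per-cluster workaround is to repoint the cluster agent itself: on an imported cluster, the cattle-cluster-agent deployment carries the Rancher URL in its CATTLE_SERVER environment variable. This is a sketch rather than an officially supported setting, Rancher may reconcile the value back on agent upgrades, and the hostname below is a placeholder.

```bash
# Run against the imported EKS cluster, not the Rancher management cluster.
# Repoints the cluster agent at a private Rancher endpoint; the address is
# a placeholder and must be covered by Rancher's TLS certificate.
kubectl -n cattle-system set env deployment/cattle-cluster-agent \
  CATTLE_SERVER=https://rancher.internal.example.com
```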

Depending on your reason for wanting to use the private IPs, you may also want to look into an air-gapped install.