How to allow access to external IP from kubernetes cluster?


I installed Kubernetes and Docker Desktop on a local Linux VM for testing purposes.

I also deployed an API that receives data from the client and runs an insert command against a SQL database reachable on the host machine's local network, outside the Kubernetes cluster.

When I try to post data to the API, I get an "unable to connect SQL server" error. I used my SQL server's IP address in the connection string.
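For reference, the connection string is shaped roughly like this; the IP, port, database name, and credentials below are placeholders:

Server=192.168.1.50,1433;Database=MyDb;User Id=sa;Password=<secret>;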

How can I allow access to an external IP from the Kubernetes cluster?

When I try to ping the SQL server's IP from one of the pods, it says "no route to host", but I am able to ping Google.
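This is roughly how I run the check, using a throwaway busybox pod (the IP is a placeholder for my SQL server's address):

kubectl run nettest --rm -it --image=busybox --restart=Never -- ping -c 3 192.168.1.50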

I have tried the methods suggested in the link below, but they didn't work.

How to use a host IP from a container?

1 Answer

Mandraenke:

First, ask such questions on Server Fault instead.

Second, use the search function: How to access host's localhost from inside kubernetes cluster

You need to understand the networking of Kubernetes and of your local setup. From inside the cluster you can't reach the host's localhost directly, because all traffic passes through the network overlay in use. So you will need some routing of traffic to your local machine. Things become more difficult when the database is, for example, running inside a Docker container on the host. For a more detailed answer, more information about your setup is needed.
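That said, once the host is reachable by its IP, a common pattern is to give the database a stable in-cluster name via a Service without a selector plus a manually created Endpoints object. A minimal sketch, assuming SQL Server on its default port 1433 and a host IP of 192.168.19.131 (both placeholders for your actual setup):

kubectl apply -f - <<EOF
apiVersion: v1
kind: Service
metadata:
  name: external-sql
spec:
  ports:
  - port: 1433
    targetPort: 1433
---
apiVersion: v1
kind: Endpoints
metadata:
  name: external-sql
subsets:
- addresses:
  - ip: 192.168.19.131   # placeholder: your SQL host's LAN IP
  ports:
  - port: 1433
EOF

Pods can then point their connection string at external-sql:1433. On Docker Desktop specifically, the special name host.docker.internal often resolves to the host as well, which may be the simpler route.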

For illustration, here is my laptop running Kubernetes on Docker Desktop:

On the host, some fake service listening on port 8888:

nc -l 8888

From inside the cluster, localhost does not work, while the host IP address does:

root@ubuntu:/# telnet 127.0.0.1 8888
Trying 127.0.0.1...
telnet: Unable to connect to remote host: Connection refused
root@ubuntu:/# telnet 192.168.19.131 8888
Trying 192.168.19.131...
Connected to 192.168.19.131.
Escape character is '^]'.
^]
telnet> q
Connection closed.
root@ubuntu:/# 

So check routing :)
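A quick way to do that from inside a pod, assuming the image ships the ip and traceroute tools (busybox does; pod name and IP are placeholders):

kubectl exec -it mypod -- ip route
kubectl exec -it mypod -- traceroute 192.168.19.131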