EKS clusters with only public nodes using eksctl


I'm new to EKS. I want to use eksctl to create a cluster with only public nodes. I understand this isn't a best practice, but I'm just testing EKS and don't really need private subnets or NAT gateways. My cluster_config.yaml is below.

apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig
metadata:
  name: eks-test
  region: us-east-2
  version: "1.23"

vpc:
  subnets:
    public:
      us-east-2a: { id: subnet-066943323aea5b44b }
      us-east-2b: { id: subnet-0513559512aad266a }
      us-east-2c: { id: subnet-0f718da8d4f83ccb7 }

nodeGroups:
  - name: eks-test-workers
    minSize: 1
    maxSize: 1
    desiredCapacity: 1
    instanceType: t2.small
    labels: {role: worker}
    ssh:
      publicKeyName: ec2_key
    tags:
      nodegroup-role: worker
    iam:
      withAddonPolicies:
        externalDNS: true
        certManager: true
        albIngress: true

The cluster was created successfully with this command:

eksctl create cluster -f cluster_config.yaml

However, no nodes were created. In the EC2 console, I can see that an EC2 instance was launched and then terminated almost immediately. Could someone please help me understand this?
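In case it helps with diagnosis, these commands can surface why the instance is cycling; the Auto Scaling group name in the last call is a placeholder, and the call before it lists the real one created for the nodegroup.

# Show the nodegroup eksctl created for the cluster
eksctl get nodegroup --cluster eks-test --region us-east-2

# List the Auto Scaling groups to find the one backing eks-test-workers
aws autoscaling describe-auto-scaling-groups --region us-east-2 \
  --query "AutoScalingGroups[].AutoScalingGroupName"

# The scaling activities usually include the cause of each termination
aws autoscaling describe-scaling-activities --region us-east-2 \
  --auto-scaling-group-name <asg-name-from-previous-command>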


1 Answer

Answered by TrongBang (accepted answer):

In my case, I had forgotten to associate my public subnets with a route table pointing at the Internet gateway. They were using the VPC's default route table, which has no outbound route, so the worker node couldn't reach the internet to bootstrap and join the cluster.
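A minimal AWS CLI sketch of that fix is below, assuming an Internet gateway is already attached to the VPC; the vpc-, igw-, and rtb- IDs are placeholders, and the subnet ID is taken from the config above. The same association is needed for each public subnet.

# Create a dedicated route table for the public subnets (vpc id is a placeholder)
aws ec2 create-route-table --vpc-id vpc-xxxxxxxx --region us-east-2

# Add a default route to the attached Internet gateway (igw id is a placeholder)
aws ec2 create-route --route-table-id rtb-xxxxxxxx \
  --destination-cidr-block 0.0.0.0/0 --gateway-id igw-xxxxxxxx --region us-east-2

# Associate one of the public subnets from the config with the route table
aws ec2 associate-route-table --route-table-id rtb-xxxxxxxx \
  --subnet-id subnet-066943323aea5b44b --region us-east-2

# Verify which route table a subnet is actually associated with
aws ec2 describe-route-tables --region us-east-2 \
  --filters Name=association.subnet-id,Values=subnet-066943323aea5b44b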