I am able to run SparkPi on Kubernetes (deployed on GKE) without problems.
However, when I try to broadcast the computed Pi value from the job to my microservice at toys-broadcast-svc.toys.svc.cluster.local, the hostname does not resolve and I get a java.net.UnknownHostException. Can anyone help? Am I missing something here?
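The failure is reproducible with a plain name lookup from inside the driver pod. A minimal standalone check (this is not my actual job code, just the lookup it performs) throws the same exception:

import java.net.InetAddress;

public class ResolveCheck {
    public static void main(String[] args) throws Exception {
        // In-cluster FQDN of the microservice (the microservices live in the "toys" namespace).
        String host = "toys-broadcast-svc.toys.svc.cluster.local";
        // Throws java.net.UnknownHostException when the pod's DNS cannot resolve the name.
        InetAddress addr = InetAddress.getByName(host);
        System.out.println("Resolved " + host + " -> " + addr.getHostAddress());
    }
}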
For your information:
I am using the spark-operator. The microservices are in the namespace toys and Spark runs in the namespace toys-spark.
I installed the operator with Helm:
helm install sparkoperator incubator/sparkoperator --namespace toys-spark-operator --set sparkJobNamespace=toys-spark,enableWebhook=true
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
name: spark-pi
namespace: toys-spark #apps namespace
spec:
type: Java
mode: cluster
image: toysindia/spark:3.0.1
imagePullPolicy: Always
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.0.1.jar
sparkVersion: 3.0.1
restartPolicy:
type: Never
volumes:
- name: "toys-spark-test-volume-driver"
hostPath:
path: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/driver"
type: Directory
- name: "toys-spark-test-volume-executor"
hostPath:
path: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/executor"
type: Directory
driver:
cores: 1
coreLimit: "1200m"
memory: "512m"
labels:
version: 3.0.1
serviceAccount: spark
volumeMounts:
- name: "toys-spark-test-volume-driver"
mountPath: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/driver"
executor:
cores: 1
instances: 1
memory: "512m"
labels:
version: 3.0.1
volumeMounts:
- name: "toys-spark-test-volume-executor"
mountPath: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/executor"
sparkConf:
spark.eventLog.dir:
spark.eventLog.enabled: "true"
---
apiVersion: v1
kind: Namespace
metadata:
name: toys-spark-operator
---
apiVersion: v1
kind: Namespace
metadata:
name: toys-spark #apps namespace
---
apiVersion: v1
kind: ServiceAccount
metadata:
name: spark
namespace: toys-spark #apps namespace
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: spark-operator-role
namespace: toys-spark #apps namespace
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: edit
subjects:
- kind: ServiceAccount
name: spark
namespace: toys-spark #apps namespace
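For completeness, the driver-side call is essentially shaped like the sketch below. Only the service hostname is the real one from my cluster; the port, the /pi path and the JSON body are simplified placeholders. The name lookup for this URL is where the UnknownHostException is thrown.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PiBroadcast {
    public static void main(String[] args) throws Exception {
        double pi = 3.141592653589793; // value computed by the SparkPi job
        // Placeholder port and path; only the hostname matches my setup.
        URL url = new URL("http://toys-broadcast-svc.toys.svc.cluster.local:8080/pi");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        byte[] body = ("{\"pi\":" + pi + "}").getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        System.out.println("Response code: " + conn.getResponseCode());
    }
}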