AWS Elastic MapReduce is capable of a lot, but it has some rough edges I'd like to sidestep for the fairly inexpensive computation I want to do in Apache Spark. Specifically, I'd like to see whether I can get a (Scala) Spark application running on AWS ECS/Fargate. If I can get it working with just one container running in client/local mode, all the better.

I started by making a distribution of Spark with the hadoop-3.1 (for AWS STS support) and kubernetes profiles selected:

# in apache/spark git repository under tag v2.4.0
./dev/make-distribution.sh --name hadoop3-kubernetes -Phadoop-3.1 -Pkubernetes -T4

Then, from within that distribution, I built a generic Spark Docker image:

docker build -t spark:2.4.0-hadoop3.1 -f kubernetes/dockerfiles/spark/Dockerfile .
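
A quick way to sanity-check the resulting image (assuming the standard /opt/spark layout that the Kubernetes Dockerfile sets up) is to override the entrypoint and print the Spark version:

# optional sanity check; --entrypoint bypasses the image's default Kubernetes entrypoint script
docker run --rm --entrypoint /opt/spark/bin/spark-submit spark:2.4.0-hadoop3.1 --version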

Then, within my project, I built another Docker image on top of that one, copying my sbt-assembled uberjar into the working directory and setting the entrypoint to the spark-submit shell script.

# Dockerfile
FROM spark:2.4.0-hadoop3.1
COPY target/scala-2.11/my-spark-assembly.jar .
ENTRYPOINT [ "/opt/spark/bin/spark-submit" ]

On my local machine, I can run that application by supplying the appropriate arguments in a docker-compose command specification:

# docker-compose.yml
...
   command:
     - --master
     - local[*]
     - --deploy-mode
     - client
     - my-spark-assembly.jar
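
With that in place, bringing the service up runs the job end to end on my laptop:

docker-compose up --build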

Unfortunately, on ECS with Fargate it fails almost immediately, with the following stack trace written to CloudWatch:

Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:714)
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
at org.apache.spark.SparkConf$$anonfun$getOption$1.apply(SparkConf.scala:388)
at org.apache.spark.SparkConf$$anonfun$getOption$1.apply(SparkConf.scala:388)
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.SparkConf.getOption(SparkConf.scala:388)
at org.apache.spark.SparkConf.get(SparkConf.scala:250)
at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopConfigurations(SparkHadoopUtil.scala:463)
at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:436)
at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply(SparkSubmit.scala:334)
at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply(SparkSubmit.scala:334)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:334)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: c0d66fa49434: c0d66fa49434: Name does not resolve
at java.net.InetAddress.getLocalHost(InetAddress.java:1506)
at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:946)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:939)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:939)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:996)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:996)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.localCanonicalHostName(Utils.scala:996)
at org.apache.spark.internal.config.package$.<init>(package.scala:296)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
... 18 more
Caused by: java.net.UnknownHostException: c0d66fa49434: Name does not resolve
at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:929)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1324)
at java.net.InetAddress.getLocalHost(InetAddress.java:1501)
... 27 more
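
The root cause looks like the Fargate container's hostname (c0d66fa49434 above) not resolving to any address, so InetAddress.getLocalHost blows up while Spark is working out its local address. One workaround I'm considering, but haven't verified, is to pin the address explicitly with the SPARK_LOCAL_IP environment variable, which Spark checks before falling back to a hostname lookup; for example, in the ECS task definition:

# untested idea: set SPARK_LOCAL_IP in the containerDefinitions entry of the task definition
"environment": [
  { "name": "SPARK_LOCAL_IP", "value": "127.0.0.1" }
]

Another thought is appending the container's hostname to /etc/hosts in a small wrapper script at startup, but I'd rather use a supported knob if one exists.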

Has anyone out there had any success with a similar attempt?
