How to solve a YAML file size (code point limit) issue in Spark jobs caused by a version conflict when using sbt


Our application is written in Scala and uses sbt-assembly to build the project.

We are using moultingyaml 0.4.1 in our project; it is a Scala wrapper around snakeyaml 1.26:

import net.jcazevedo.moultingyaml._
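
For completeness, the relevant build configuration looks roughly like this (the sbt-assembly plugin version shown is illustrative; the moultingyaml coordinates are as we believe the library is published):

// project/plugins.sbt (plugin version illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")

// build.sbt
libraryDependencies += "net.jcazevedo" %% "moultingyaml" % "0.4.1"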

But on the Dataproc cluster, the master and worker nodes ship snakeyaml 2.0, which creates a conflict and a code point limit issue (setCodePointLimit was only introduced in snakeyaml 1.32 and later).
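
For context, this is roughly where the failure surfaces; the file path is a placeholder, and the setCodePointLimit call shown in the comment only exists in snakeyaml 1.32+ and, as far as we can tell, is not exposed through moultingyaml 0.4.1 anyway:

import net.jcazevedo.moultingyaml._

// Sketch of our parsing code; the real YAML file is large enough to hit
// snakeyaml 2.0's default code point limit when the cluster's jar wins.
val yamlText = scala.io.Source.fromFile("/path/to/large-config.yaml").mkString
val parsed: YamlValue = yamlText.parseYaml

// The knob we would need is missing from 1.26 (it exists only in 1.32 and later):
// new org.yaml.snakeyaml.LoaderOptions().setCodePointLimit(10 * 1024 * 1024)

Searching the cluster nodes shows only the 2.0 jars: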

sudo find / -iname \*snakeyaml\* 2> /dev/null
/usr/lib/spark/jars/snakeyaml-2.0.jar
/usr/lib/hadoop-yarn/lib/snakeyaml-2.0.jar
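
To confirm which copy actually wins inside the running job (rather than just on disk), we could log where the class is loaded from; a minimal sketch:

// Prints the jar that the snakeyaml classes are really loaded from at runtime
// (getCodeSource can be null for bootstrap classes, but not for snakeyaml).
val snakeYamlSource = classOf[org.yaml.snakeyaml.Yaml].getProtectionDomain.getCodeSource.getLocation
println(s"snakeyaml loaded from: $snakeYamlSource")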

Can you please suggest how we can resolve this version conflict so that our code uses only snakeyaml 1.26 at runtime?
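
One option we are considering (untested on our side) is relocating the snakeyaml packages inside our fat jar with sbt-assembly's shade rules, so that the 1.26 classes we bundle no longer collide with the cluster's 2.0 jars; a rough sketch for build.sbt, with the shaded prefix being arbitrary:

// build.sbt sketch: rename the org.yaml.snakeyaml packages inside the assembly
// so that moultingyaml resolves the bundled 1.26 classes under a private prefix.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("org.yaml.snakeyaml.**" -> "shaded.org.yaml.snakeyaml.@1").inAll
)

An alternative we have seen mentioned is setting spark.driver.userClassPathFirst and spark.executor.userClassPathFirst to true so that classes from our assembly take precedence, but those flags are marked experimental and we are unsure about side effects on Dataproc. Would shading be the recommended approach here, or is there a better way?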
