I added this to <my_project_name>/project/plugins.sbt:
resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"
addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
in order to import sbt-spark-package, but sbt tells me "Extracting structure failed: Build status: Error".
I tried with other plugins, but the behavior is always the same.
sbt version: 1.8.2
Scala version: 2.13.10
See the ticket "dl.bintray.com/spark-packages/maven is forbidden" (https://github.com/databricks/sbt-spark-package/issues/50): Bintray was shut down in May 2021, so the resolver URL in your plugins.sbt no longer works. The package is listed at https://spark-packages.org/package/databricks/sbt-spark-package, but you'll have to build it from source.
Clone the plugin's repository and build it from source.
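A sketch of those steps (the clone URL matches the issue links above; a working git and an sbt launcher are assumed to be installed):

```
git clone https://github.com/databricks/sbt-spark-package.git
cd sbt-spark-package
sbt package   # builds the plugin JAR under target/scala-2.10/sbt-0.13/
```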
Now you can find a JAR at sbt-spark-package/target/scala-2.10/sbt-0.13/sbt-spark-package-0.2.6.jar. Do

```
sbt publishLocal
```

and it will be published at ~/.ivy2/local/org.spark-packages/sbt-spark-package/scala_2.10/sbt_0.13/0.2.6/jars/sbt-spark-package.jar. Now you can use this sbt plugin in your project:
build.sbt
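A minimal sketch of a build.sbt for a project using the plugin; spName, sparkVersion, and sparkComponents are the plugin's own setting keys, while the project name and the version numbers shown are assumptions:

```scala
name := "my-spark-app"            // hypothetical project name
version := "0.1.0"
scalaVersion := "2.11.12"         // assumed; a Scala version supported by Spark 2.x

spName := "my-org/my-spark-app"   // hypothetical spark-packages name
sparkVersion := "2.2.0"           // assumed Spark version
sparkComponents += "sql"          // adds spark-sql as a provided dependency
```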
project/build.properties
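Because the plugin targets sbt 0.13.x (see the note below), build.properties must pin an sbt 0.13 release, for example the last one:

```
sbt.version=0.13.18
```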
project/plugins.sbt
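After sbt publishLocal the artifact resolves from ~/.ivy2/local, which sbt checks by default, so no extra resolver is needed:

```scala
addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
```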
Please notice that sbt-spark-package is a plugin for sbt 0.13.x, not sbt 1.x (see "Support SBT 1.x", https://github.com/databricks/sbt-spark-package/issues/40).
In order to use the plugin with sbt 1.8.2 and Scala 2.13.10 you'll have to upgrade it yourself.
Moreover, sbt-spark-package seems to be outdated, abandoned, and deprecated:

- "java.lang.NoSuchMethodError: sbt.UpdateConfiguration.copy$default$1()Lscala/Option" (https://github.com/databricks/sbt-spark-package/issues/51)
- "Is this plugin deprecated?" (https://github.com/databricks/sbt-spark-package/issues/48)