Using extra Spark jars with databricks-connect>=13.0


With the newest version of databricks-connect, I cannot configure the extra jars I want to use. In older versions, I did that via:

from pyspark.sql import SparkSession

spark = (SparkSession.builder.appName('DataFrame')
         .config('spark.jars.packages', 'org.apache.spark:spark-avro_2.12:3.3.0')
         .getOrCreate())

How can I configure this with databricks-connect>=13.0?
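
For reference, session creation under databricks-connect >= 13.0 goes through DatabricksSession rather than the plain SparkSession builder. The sketch below (assuming connection details come from environment variables or a Databricks config profile) only shows that session setup; it is not a confirmed way to attach the extra jar, which is exactly what I am asking about.

from databricks.connect import DatabricksSession

# Session setup only; host, cluster, and token are assumed to come from
# environment variables or a Databricks config profile.
spark = DatabricksSession.builder.getOrCreate()

# Where an equivalent of .config('spark.jars.packages', ...) would go here
# is the open question.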
