Spark submit a jar via a remote repository


I am trying to submit an artifact hosted on JFrog via an https URL, but I am getting:

401 bad credentials

I even tried submitting with the credentials embedded in the URL, `https://<username>:<password>@jfrog.io/artifactory/spark-test/1.0.0-SNAPSHOT/jarName.jar`, as mentioned in the official documentation, but I still get the same error.
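(For context: one common cause of a 401 when embedding credentials in the URL is special characters in the username or password that are not percent-encoded, which silently breaks the `user:pass@host` structure. A minimal sketch; the host and credentials below are placeholders, not the real ones:)

```python
from urllib.parse import quote

def embed_credentials(url: str, username: str, password: str) -> str:
    """Insert percent-encoded credentials into an https URL."""
    scheme, rest = url.split("://", 1)
    # quote with safe="" so characters like '@', ':' and '/' in the
    # password are escaped instead of breaking the URL structure
    return f"{scheme}://{quote(username, safe='')}:{quote(password, safe='')}@{rest}"

# hypothetical credentials for illustration only
jar_url = embed_credentials(
    "https://jfrog.io/artifactory/spark-test/1.0.0-SNAPSHOT/jarName.jar",
    "deploy-user",
    "p@ss:word",
)
print(jar_url)
# https://deploy-user:p%40ss%3Aword@jfrog.io/artifactory/spark-test/1.0.0-SNAPSHOT/jarName.jar
```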

{
    "action": "CreateSubmissionRequest",
    "appResource": "https://jfrog.io/artifactory/spark-test/1.0.0-SNAPSHOT/jarName.jar",
    "clientSparkVersion": "2.4.1",
    "environmentVariables": {
        "SPARK_ENV_LOADED": 1
    },
    "mainClass": "SparkJobProcessor",
    "appArgs": [
        "sparkconf.spark.master=\"spark://spark-master.local:7077\"",
        "sparkconf.spark.kryoserializer.buffer.max=\"128m\"",
        "bucket=\"-stag\"",
        "sparkconf.spark.app.name=\"psl-matchlock-pc-streaming_R60705_M63487\""
    ],
    "sparkProperties": {
        "spark.jars": "https://jfrog.io/artifactory/spark-test/1.0.0-SNAPSHOT/jarName.jar",
        "spark.driver.supervise": "false",
        "spark.app.name": "abcd",
        "spark.eventLog.enabled": "false",
        "spark.submit.deployMode": "cluster",
        "spark.master": "spark://spark-master.local:7077",
        "spark.executor.memory": "2GB",
        "spark.driver.memory": "2GB",
        "spark.executor.cores": "4",
        "spark.driver.maxResultSize": "40GB",
        "spark.driver.cores": "4",
        "spark.cores.max": "20",
        "env": "stag"
    }
}

Is there anything that needs to be configured on the JFrog side? I couldn't find any official documentation example for accessing a remote repository. I am submitting to a standalone Spark cluster running on EC2 instances. I tried hitting the URL via curl from the EC2 instance, passing the username and password, and I was able to download the artifact, but somehow when doing it via spark-submit it does not work.
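(For reference: the standalone master's REST submission server listens on port 6066 by default, not 7077, and expects a `CreateSubmissionRequest` payload like the one above as a POST to `/v1/submissions/create`. Note that the submission request itself carries no Artifactory credentials; the driver later fetches `appResource` on its own, which may be why a curl test with explicit credentials succeeds while the submission fails unless the credentials are embedded in the jar URL. A minimal stdlib sketch of building that request; the host is a placeholder and the `<username>:<password>` parts are left as-is:)

```python
import json
from urllib.request import Request

# hypothetical master host; the REST submission server listens on 6066 by default
MASTER_REST = "http://spark-master.local:6066"

payload = {
    "action": "CreateSubmissionRequest",
    # credentials must travel inside this URL itself; nothing else in the
    # submission is forwarded to the driver that downloads the jar
    "appResource": "https://<username>:<password>@jfrog.io/artifactory/spark-test/1.0.0-SNAPSHOT/jarName.jar",
    "clientSparkVersion": "2.4.1",
    "mainClass": "SparkJobProcessor",
    "appArgs": [],
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    "sparkProperties": {
        "spark.master": "spark://spark-master.local:7077",
        "spark.submit.deployMode": "cluster",
        "spark.app.name": "abcd",
    },
}

req = Request(
    f"{MASTER_REST}/v1/submissions/create",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json;charset=UTF-8"},
    method="POST",
)
# urllib.request.urlopen(req) would perform the actual submission; omitted here
```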
