(spark jdbc) SQLRecoverableException: I/O Exception: Connection reset


I have been working on a request that extracts data from an Oracle 19c instance and processes it on AWS EMR Serverless using Spark JDBC connections.

The big picture is that I can't connect to the database; every attempt fails with the error below:

java.sql.SQLRecoverableException: IO Error: Connection reset, Authentication lapse 0 ms.

I've been stuck on this for almost two months and unfortunately can't get past it. The application runs on AWS EMR Serverless (emr-6.13.0) with ojdbc10-19.15.0.0.1.jar and Java 17, and it always fails with the same error.
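
For reference, the read itself is a plain Spark JDBC extract. A minimal sketch of what the job does (host, service name, credentials, and table name are placeholders, not my real values):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-extract").getOrCreate()

# Single-table JDBC read; the connection details below are placeholders.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLPDB1")
    .option("driver", "oracle.jdbc.OracleDriver")
    .option("dbtable", "MYSCHEMA.MY_TABLE")
    .option("user", "my_user")
    .option("password", "my_password")
    .load()
)

df.show(5)  # fails with the SQLRecoverableException before any rows arrive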

Here are a few things I've already tested:

  • older versions of EMR Serverless
  • ojdbc8, ojdbc10, and ojdbc11, including builds targeted at specific Oracle 19c versions
  • Java 8 and Java 17 as the runtime
  • plain Python: I can connect to the database from Lambda and from an EC2 instance (I ran this test to confirm that other AWS services can reach the database through the required VPC); a sketch of that check follows this list
  • network connectivity: I can ping, nslookup, and telnet to the destination
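
The plain-Python check was along these lines (I'm showing python-oracledb in thin mode here; the host, service name, and credentials are placeholders):

import oracledb

# Connectivity check as run from Lambda/EC2 inside the same VPC.
# All connection details below are placeholders.
conn = oracledb.connect(
    user="my_user",
    password="my_password",
    dsn="dbhost.example.com:1521/ORCLPDB1",
)

with conn.cursor() as cursor:
    cursor.execute("SELECT 1 FROM dual")
    print(cursor.fetchone())  # (1,) confirms the database answers

conn.close()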

At this point I don't know what else to try for this error, and I've run out of ideas.

I'll also share my sparkSubmitParameters; these are the settings I'm using at the moment (the three -D options are combined into a single extraJavaOptions value per side, since repeating --conf for the same key keeps only the last value):

--conf spark.driver.extraJavaOptions="-Djavax.net.debug=all -Djava.security.egd=file:/dev/./urandom -Dsecurerandom.source=file:/dev/./urandom" 
--conf spark.executor.extraJavaOptions="-Djavax.net.debug=all -Djava.security.egd=file:/dev/./urandom -Dsecurerandom.source=file:/dev/./urandom" 
--conf spark.jars=s3://xxxxxxx/emr-serverless/ojdbc10-19.15.0.0.1.jar 
--conf spark.dynamicAllocation.enabled=true  
--conf spark.dynamicAllocation.shuffleTracking.enabled=true 
--conf spark.dynamicAllocation.minExecutors=1 
--conf spark.dynamicAllocation.maxExecutors=30 
--conf spark.driver.cores=8 
--conf spark.executor.cores=8 
--conf spark.driver.memory=52g 
--conf spark.executor.memory=52g 
--conf spark.emr-serverless.driver.disk=200g 
--conf spark.emr-serverless.executor.disk=200g 
--conf spark.emr-serverless.driverEnv.JAVA_HOME=/usr/lib/jvm/java-17-amazon-corretto.x86_64/ 
--conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-17-amazon-corretto.x86_64/
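
For context, the job is submitted with those parameters through the EMR Serverless API. A rough sketch of the submission call (the application ID, role ARN, and script path are placeholders):

import boto3

emr = boto3.client("emr-serverless")

# Placeholders: application id, role ARN, and script locations are not real.
response = emr.start_job_run(
    applicationId="00example1234567",
    executionRoleArn="arn:aws:iam::123456789012:role/emr-serverless-job-role",
    jobDriver={
        "sparkSubmit": {
            "entryPoint": "s3://xxxxxxx/emr-serverless/extract_job.py",
            "sparkSubmitParameters": (
                "--conf spark.jars=s3://xxxxxxx/emr-serverless/ojdbc10-19.15.0.0.1.jar "
                "--conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-17-amazon-corretto.x86_64/"
                # ...plus the remaining --conf flags from the list above
            ),
        }
    },
)
print(response["jobRunId"])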

Can anyone help me? Any help is welcome.
