Multiple partitions in PySpark create multiple sessions in Teradata when writing a DataFrame, causing a deadlock (via JDBC data source)

A Spark DataFrame with multiple partitions (4 in my case) creates multiple sessions in Teradata when I perform a write operation, which causes a deadlock.
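For context, here is a minimal sketch of the kind of write I am running (the host, credentials, source path, and table name are placeholders):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("teradata-write").getOrCreate()

# Placeholder source; the resulting DataFrame has 4 partitions.
df = spark.read.parquet("/data/source")

# Each DataFrame partition opens its own JDBC connection,
# i.e. one Teradata session per partition.
df.write \
    .format("jdbc") \
    .option("url", "jdbc:teradata://<host>/DATABASE=<db>") \
    .option("driver", "com.teradata.jdbc.TeraDriver") \
    .option("dbtable", "<db>.<target_table>") \
    .option("user", "<user>") \
    .option("password", "<password>") \
    .mode("append") \
    .save()
```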

I tried to write the Spark DataFrame (with 4 partitions) to Teradata. The write created 5 sessions (1 with the "Serializable" isolation level and 4 with the "Read Uncommitted" isolation level) and failed when a deadlock occurred.

Is it possible to write a Spark DataFrame to Teradata with multiple concurrent sessions without causing the deadlock?
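A workaround that presumably avoids the deadlock is to force a single writer session via the numPartitions JDBC option (Spark coalesces the DataFrame down to this limit before writing); a sketch with the same placeholder connection details:

```python
# Cap JDBC write parallelism at one connection, so only a single
# Teradata session performs the insert.
df.write \
    .format("jdbc") \
    .option("url", "jdbc:teradata://<host>/DATABASE=<db>") \
    .option("driver", "com.teradata.jdbc.TeraDriver") \
    .option("dbtable", "<db>.<target_table>") \
    .option("user", "<user>") \
    .option("password", "<password>") \
    .option("numPartitions", "1") \
    .mode("append") \
    .save()
```

Calling df.coalesce(1) before the write should have the same effect. But that serializes the load and gives up the concurrency I am trying to keep, hence the question.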
