Spark Cassandra Connection FrameTooLongException

# Read a table from Cassandra via the Spark Cassandra Connector
data = spark_session.read \
            .format('org.apache.spark.sql.cassandra') \
            .options(table=table, keyspace=keyspace) \
            .load()
data = data.cache()
# Materialize the read; this is where the error is thrown
print(data.count())

Caused by: com.datastax.driver.core.exceptions.FrameTooLongException: Response frame exceeded maximum allowed length

I am getting the above error when connecting from Spark 2.4.7, with PySpark as the programming language. For the Cassandra connection I am using spark-cassandra-connector-2.3.2_patched.jar.

I don't have access to the source system, so I cannot change the native_transport_max_frame_size_in_mb setting in cassandra.yaml as mentioned here.

Is there any way to override Cassandra's native_transport_max_frame_size_in_mb configuration via spark.conf.set while reading from Cassandra?
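For what it's worth, native_transport_max_frame_size_in_mb is a server-side cassandra.yaml setting, so it cannot be overridden from the Spark client. A common client-side workaround is to shrink each response page so individual frames stay under the server's limit. The sketch below assumes the spark-cassandra-connector 2.x read-tuning option spark.cassandra.input.fetch.size_in_rows (verify it against the patched jar in use); the table and keyspace names are hypothetical:

```python
# Sketch, not a definitive fix: lower the rows fetched per driver page
# so each response frame from Cassandra is smaller.
read_options = {
    "table": "my_table",        # hypothetical table name
    "keyspace": "my_keyspace",  # hypothetical keyspace name
    # Rows per page fetched by the driver (connector 2.x default: 1000).
    # Lowering it reduces the size of each response frame.
    "spark.cassandra.input.fetch.size_in_rows": "100",
}

# Per-read, via DataFrameReader options (uncomment in a live session):
# data = (spark_session.read
#         .format("org.apache.spark.sql.cassandra")
#         .options(**read_options)
#         .load())

# Or session-wide, matching the spark.conf.set approach in the question:
# spark_session.conf.set("spark.cassandra.input.fetch.size_in_rows", "100")
```

Note the caveat: if a single row or cell (e.g. a very large blob) by itself exceeds the server's frame limit, no fetch-size tuning on the client will help and the cassandra.yaml setting would have to change.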


There are 0 answers