How to read a large Avro file


I am trying to read a large Avro file (2 GB) with spark-shell, but I am getting a StackOverflowError.

val newDataDF = spark.read.format("com.databricks.spark.avro").load("abc.avro")
java.lang.StackOverflowError
  at com.databricks.spark.avro.SchemaConverters$.toSqlType(SchemaConverters.scala:71)
  at com.databricks.spark.avro.SchemaConverters$.toSqlType(SchemaConverters.scala:81)

I tried increasing the driver memory and executor memory, but I still get the same error.

./bin/spark-shell --packages com.databricks:spark-avro_2.11:3.1.0 --driver-memory 8G --executor-memory 8G
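Since a StackOverflowError is about exhausting the thread stack (the recursion in SchemaConverters.toSqlType) rather than the heap, I am not sure more memory helps. One thing I am considering is raising the JVM thread stack size via spark.driver.extraJavaOptions / spark.executor.extraJavaOptions; the -Xss value here is just a guess:

./bin/spark-shell --packages com.databricks:spark-avro_2.11:3.1.0 \
  --driver-memory 8G --executor-memory 8G \
  --conf "spark.driver.extraJavaOptions=-Xss64m" \
  --conf "spark.executor.extraJavaOptions=-Xss64m"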

How can I read this file? Is there a way to partition it?
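For what it's worth, a variant I am also considering is the Avro source that ships with Spark 2.4+ (org.apache.spark:spark-avro instead of the Databricks package); this is only a sketch and assumes the same abc.avro path:

./bin/spark-shell --packages org.apache.spark:spark-avro_2.11:2.4.0 --driver-memory 8G
val newDataDF = spark.read.format("avro").load("abc.avro")

As I understand it, Avro container files are block-splittable, so Spark should already break the file into multiple partitions (controlled by spark.sql.files.maxPartitionBytes); the overflow seems to happen while converting the schema, before any data is read.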


There are 0 answers