Creating a partitioned table in Postgres via Spark JDBC write


I want to write a DataFrame to a Postgres table via the Spark JDBC connector. The table I am writing to in Postgres needs to be partitioned by a certain column. This is how I am currently writing it. I am running Spark 3.2.3 and Postgres 11:

import java.util.Properties

val username = "myuser"
val password = "password"
val url = "jdbc:postgresql://localhost:5432/mydb"

val connectionProperties = new Properties()
connectionProperties.put("user", username)
connectionProperties.put("password", password)
// The options I am hoping will drive the partitioning of the target table:
connectionProperties.put("partitionColumn", "login_date")
connectionProperties.put("numPartitions", "1")
connectionProperties.put("upperBound", "2022-02-10 00:00:00")
connectionProperties.put("lowerBound", "2022-02-09 00:00:00")

df.write.mode("append").jdbc(url, "pgloader", connectionProperties)
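For what it's worth, the only place I have used partitionColumn / lowerBound / upperBound / numPartitions before is on the read side, where they split a scan into parallel range queries. A minimal read-side sketch against the same table (assuming a SparkSession named spark; the bounds are just the ones above):

// Read-side usage of the same options: Spark issues numPartitions
// range queries over partitionColumn between the two bounds.
val readDf = spark.read
  .format("jdbc")
  .option("url", url)
  .option("dbtable", "pgloader")
  .option("user", username)
  .option("password", password)
  .option("partitionColumn", "login_date")
  .option("lowerBound", "2022-02-09 00:00:00")
  .option("upperBound", "2022-02-10 00:00:00")
  .option("numPartitions", "4")
  .load()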

Initially there is no table, and I run the write above. It creates the table, but no partitions are created. This is how the table looks when I describe it:

(screenshot of the table description, showing an ordinary non-partitioned table)

The login_date column has not been range partitioned in the Postgres table, even though I specified it in the connection properties.

How can I create a partitioned Postgres table via Spark JDBC?
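One workaround I am considering is creating the range-partitioned table and its partitions up front with plain JDBC DDL, and only then letting Spark append into it. A minimal sketch, assuming a single partition for the date range above (the partition name pgloader_20220209 and the column list are illustrative, not my actual schema):

import java.sql.DriverManager

// Pre-create the range-partitioned parent table and one partition,
// then let Spark's append insert through the parent.
val conn = DriverManager.getConnection(url, username, password)
try {
  val stmt = conn.createStatement()
  stmt.execute(
    """CREATE TABLE IF NOT EXISTS pgloader (
      |  login_date timestamp NOT NULL
      |  -- remaining columns matching the DataFrame schema
      |) PARTITION BY RANGE (login_date)""".stripMargin)
  stmt.execute(
    """CREATE TABLE IF NOT EXISTS pgloader_20220209
      |PARTITION OF pgloader
      |FOR VALUES FROM ('2022-02-09 00:00:00') TO ('2022-02-10 00:00:00')""".stripMargin)
  stmt.close()
} finally {
  conn.close()
}

// The append now targets the pre-created partitioned table,
// and Postgres routes each row to the matching partition.
df.write.mode("append").jdbc(url, "pgloader", connectionProperties)

I haven't verified this is the intended approach, and it means maintaining the DDL outside of Spark, so I would still prefer a way to have the JDBC writer create the partitions itself if one exists.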
