A DataFrame won't save as a table, be it permanent or temporary, and no error message is given.
I have tried:
df.createOrReplaceTempView("tmp_table")
# or
df.createGlobalTempView("tmp_table")
# or
df.writeTo("tmp_table").create()  # writeTo() alone only returns a writer; it needs a terminal action such as .create()
I check whether the table or view exists using:
spark.sql("SHOW TABLES like 'tmp_table'")
# or
spark.sql("SHOW TABLES IN global_temp LIKE 'tmp_table'") # for the global view; LIKE matches table names only, so the database goes in the IN clause
Nothing is displayed; no such view or table exists.
PySpark version: 3.1.2
Any ideas what could be at fault here?
Edit: this problem happens when using Spark Structured Streaming; it is not reproducible in batch mode.
Edit 2: after a logic change in the script (I need to process the JSON data), if I run this as the first use of the Spark context in the script:
df_schema = spark.read.json(df.rdd.map(lambda x: x.data)).schema
then createOrReplaceTempView doesn't work.
If I skip the schema logic and instead make this the first use of the Spark context:
df = spark.read.json(another_df.rdd.map(lambda x: x.body))
then the views are created properly. Could it be a Spark context issue? Thanks