How to enable Hive support for Spark in a notebook?
By default I get a pre-defined SparkSession object (spark), but it is not Hive-enabled. How can I get a Hive-enabled SparkSession?
I know I'm late to answer this question, but I hope it will be useful for someone who's working on it.
If the spark-defaults file doesn't have the spark.sql.catalogImplementation property set, Toree's SQL context falls back to the default local metastore (Derby) instead of Hive. You need to explicitly set this property to hive in the spark-defaults.conf file on the cluster, like this:
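A minimal sketch of the entry, assuming the usual conf location under $SPARK_HOME (the path may differ on your cluster):

# $SPARK_HOME/conf/spark-defaults.conf
spark.sql.catalogImplementation  hive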
Restart the kernel after saving the changes in this file.
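Once the kernel is back up, a quick way to check that the setting took effect is to read the property back and list the catalog's databases. A sketch, assuming the pre-defined spark session in a Toree (Scala) notebook and a reachable Hive metastore:

// Should now return "hive" instead of "in-memory"
spark.conf.get("spark.sql.catalogImplementation")

// Databases listed here should come from the Hive metastore
spark.catalog.listDatabases.show(false)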