In the HiveThriftServer2 class, what is the difference between calling startWithContext versus calling main?
I have a custom UDF jar that I want to register, so that every time the Thrift server boots up, these are all auto-configured. Is there a way to do this?
Can I use a HiveContext to register the UDF jar and functions, and then call HiveThriftServer2.startWithContext to start up the server?
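For context, what I have in mind is roughly the following untested sketch (assuming the Spark 1.x HiveContext API; the jar path, function name, and UDF class are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object ThriftServerWithUdfs {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("thrift-with-udfs"))
    val hiveContext = new HiveContext(sc)

    // ADD JAR and CREATE TEMPORARY FUNCTION are plain HiveQL statements;
    // the jar path and class name below are placeholders for my actual UDF.
    hiveContext.sql("ADD JAR /path/to/my-udfs.jar")
    hiveContext.sql("CREATE TEMPORARY FUNCTION my_udf AS 'com.example.MyUdf'")

    // Serve this pre-configured context over JDBC/ODBC instead of letting
    // the Thrift server build its own context via main.
    HiveThriftServer2.startWithContext(hiveContext)
  }
}
```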
Thanks
What you are looking for is called hive.aux.jars.path, and it's a Hive property, not something Spark-specific. I personally haven't tried it, but I'm thinking of something like this:
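A sketch of setting it in hive-site.xml (which Spark reads from its conf directory); the jar path below is a placeholder for your actual UDF jar:

```xml
<!-- hive-site.xml fragment; /path/to/my-udfs.jar is a placeholder -->
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///path/to/my-udfs.jar</value>
</property>
```

Note that this only puts the jar on the classpath; you would presumably still need CREATE FUNCTION (or CREATE TEMPORARY FUNCTION) statements to register the individual UDFs by name.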