How to create custom Spark-Native functions without forking/modifying Spark itself


I am looking into converting some UDFs/UDAFs to Spark-Native functions to leverage Catalyst and codegen.
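For concreteness, here is the kind of UDF I am starting from (a made-up `doubleUp` function; the real ones are more involved). Catalyst treats it as an opaque black box, so it cannot take part in codegen or most optimizations:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

// Hypothetical example of the kind of UDF I have today.
object CurrentUdf {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("udf-example").getOrCreate()
    import spark.implicits._

    // Opaque to Catalyst: no codegen, no expression-level optimization.
    val doubleUp = udf((x: Long) => x * 2L)

    Seq(1L, 2L, 3L).toDF("x").select(doubleUp($"x").as("y")).show()
  }
}
```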

Looking through existing implementations (for example, the Levenshtein function added in https://github.com/apache/spark/pull/7214/files), it seems these functions have to be added to the Spark framework itself, i.e. registered in FunctionRegistry.scala.

Is there a way to add custom Spark-Native functions in "userspace", i.e. without forking or modifying the actual Spark codebase?
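To make the question concrete, below is a minimal sketch of the kind of expression I would like to define and use entirely from user code. It compiles against Spark's internal 2.x/3.x APIs (`UnaryExpression`, `defineCodeGen`, etc.), so the class and helper names (`DoubleUp`, `doubleUp`) are purely illustrative and version-dependent:

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.catalyst.expressions.{Expression, UnaryExpression}
import org.apache.spark.sql.catalyst.expressions.codegen.{CodegenContext, ExprCode}
import org.apache.spark.sql.types.{DataType, LongType}

// Hypothetical Catalyst-native version of the UDF above, defined outside the
// Spark source tree, with both an interpreted path and a codegen path.
case class DoubleUp(child: Expression) extends UnaryExpression {
  override def dataType: DataType = LongType

  // Interpreted (non-codegen) evaluation path.
  override protected def nullSafeEval(input: Any): Any =
    input.asInstanceOf[Long] * 2L

  // Whole-stage codegen path: emit the Java fragment Catalyst will compile.
  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode =
    defineCodeGen(ctx, ev, c => s"($c) * 2L")

  // Needed to satisfy the TreeNode API on Spark 3.2+; a harmless extra method on older versions.
  protected def withNewChildInternal(newChild: Expression): DoubleUp = copy(child = newChild)
}

// One way to call it from the DataFrame API without touching FunctionRegistry:
// wrap the expression in a Column directly.
object DoubleUpUsage {
  def doubleUp(c: Column): Column = new Column(DoubleUp(c.expr))
}
```

Wrapping the expression in a `Column` like this seems to work for the DataFrame API, but it leans on internal classes and does not register the function for SQL, which is why I am wondering whether there is a supported "userspace" route.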

Thank you!
