I am coding in Python on Databricks, using Spark 2.4.5.
I have several notebooks for loading my dimension tables and my fact tables, and two master notebooks for loading the Dimensions and the Facts.
I have developed some UDFs for testing, auditing, and logging, and I need to make them available in each notebook. For now, I have the following command at the top of each notebook:
%run ../Functions
But I am wondering whether, by doing that, I am loading my UDFs several times when I run all my notebooks.
I thought maybe I should load my UDFs just once in my master notebook, but then I don't know what to do at development time when I need to run a single notebook separately.
Is there any way, at the top of each notebook, to check whether my UDFs are already loaded and run the Functions notebook only if they are not? Something like:
if (UDFs are not loaded):
    %run ../Functions
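
For example, this is a rough sketch of the kind of check I have in mind (just an illustration; "log_audit" is a placeholder name for one of my own UDFs defined in ../Functions):

# Sketch only: test whether one of the UDFs from the Functions notebook
# is already defined in the current session.
udfs_loaded = "log_audit" in globals()  # "log_audit" is a placeholder UDF name

if not udfs_loaded:
    print("UDFs not loaded yet")
    # this is where I would want something equivalent to: %run ../Functions

I am not sure this is the right approach, since %run seems to be a notebook magic rather than a Python statement, so I don't know if it can be called conditionally like this.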