How can I convert 'pyspark.dbutils.DBUtils' to 'dbruntime.dbutils.DBUtils' in Databricks?


I am working on a project where we have some helper functions that use dbutils. They were originally written in a notebook, but they have now been converted into Python modules, and I can no longer call those methods because dbutils is not defined inside the modules.
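For context, here is a stripped-down sketch of the kind of helper I mean (the module path, function name and scope are made up for illustration):

# helpers/secrets_helper.py -- helper extracted from the notebook into a module
SECRET_SCOPE = "my-scope"  # placeholder

def read_secret(key):
    # In a notebook, dbutils is injected automatically, so this works there;
    # once the function lives in a plain Python module, dbutils is undefined.
    return dbutils.secrets.get(scope=SECRET_SCOPE, key=key)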

I searched for a way of obtaining dbutils that works when called from a notebook as well as from a Python module, and I found some Stack Overflow answers that suggest the methods below:

def get_db_utils(spark):
    dbutils = None

    if spark.conf.get("spark.databricks.service.client.enabled") == "true":
        # Running via Databricks Connect / an IDE: build a DBUtils wrapper from the SparkSession
        print("Inside IDE Dbutils")
        from pyspark.dbutils import DBUtils
        dbutils = DBUtils(spark)
    else:
        # Running inside a notebook: reuse the dbutils object injected into the IPython user namespace
        print("Inside Notebook Dbutils")
        import IPython
        dbutils = IPython.get_ipython().user_ns["dbutils"]

    return dbutils


def get_dbutils(spark):
    from pyspark.dbutils import DBUtils
    return DBUtils(spark)
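This is how I would expect to plug one of these getters into a module-level helper (usage sketch only; the function name and scope are hypothetical):

from pyspark.sql import SparkSession

def read_secret(key, scope):
    # Build (or reuse) a SparkSession and derive dbutils from it
    spark = SparkSession.builder.getOrCreate()
    dbutils = get_db_utils(spark)
    return dbutils.secrets.get(scope=scope, key=key)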

When I check the type of the dbutils references returned by these functions:

dbutils1 = get_db_utils(spark)
dbutils2 = get_dbutils(spark)
print(type(dbutils1))
print(type(dbutils2)) 

Both print <class 'pyspark.dbutils.DBUtils'>, whereas printing the type of the actual dbutils built into the notebook gives <class 'dbruntime.dbutils.DBUtils'>.
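The module each class comes from can also be printed directly (dbutils here being the object injected into the notebook):

print(type(dbutils1).__module__)  # 'pyspark.dbutils'
print(type(dbutils).__module__)   # 'dbruntime.dbutils'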

Reading a secret value with the notebook's actual dbutils runs and works properly, but whenever I use dbutils1 or dbutils2, for example:

secret_value = dbutils1.secrets.get(scope=SECRET_SCOPE, key="Key")

it fails with the error below:

IllegalArgumentException: Invalid URI host: null (authority: null)
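For comparison, the equivalent call with the notebook-injected dbutils returns the secret without any error (same SECRET_SCOPE and key placeholders as above):

secret_value = dbutils.secrets.get(scope=SECRET_SCOPE, key="Key")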

Is there any way I can get around this error?

