Adding a workspace file to Shared Databricks workspace


I'm trying to upload a Python file as a workspace file to my Shared Databricks workspace. With both the CLI and the REST API, each Python file I upload automatically turns into a notebook, but I want it to be a workspace file (because I want to use the file as a Python module, and notebooks unfortunately cannot be used as Python modules).

Is there any way to programmatically import Python workspace files?

I'm on Databricks Runtime 13.3 LTS.

I have tried uploading Python files with both the CLI and the REST API, and in both cases they turned into notebooks. Manually creating the Python files through the UI works fine.


1 Answer

Accepted answer by Alex Ott

You need to use AUTO as the import format, not SOURCE, etc. Here is a working example for the new Databricks CLI:

databricks workspace import --format AUTO --file add-cluster-logs.sh \
  '/InitScripts/add-cluster-logs.sh'
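
The same AUTO format works through the REST API's /api/2.0/workspace/import endpoint, which the question also mentions. Below is a minimal sketch using the Python requests library; the workspace host, token, local file name my_module.py, and target path /Shared/my_module.py are placeholders to adapt:

import base64
import requests

host = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
token = "<personal-access-token>"                        # placeholder access token

# The import API expects the file content base64-encoded
with open("my_module.py", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "/Shared/my_module.py",
        "format": "AUTO",   # AUTO keeps a plain .py file as a workspace file
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()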

P.S. For Python files, check that it doesn't have

# Databricks notebook source

in the first line - otherwise it will be imported as a notebook.
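
Once the file is imported as a workspace file, it can be used as a module from a notebook. A minimal sketch, assuming the file was uploaded to /Shared/my_module.py (on recent runtimes such as 13.3 LTS, workspace files are exposed under /Workspace):

import sys

# Make the folder containing the workspace file importable;
# adjust the path to wherever the file was uploaded
sys.path.append("/Workspace/Shared")

import my_module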