Copy txt file from Azure Files to Blob Storage using Databricks


I want to read a file from Azure Files (which already works, using ShareClient) and export this file to Azure Blob Storage.
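
For context, file_client is created with ShareClient from the azure-storage-file-share package, along these lines (the share name, file path, and secret key below are placeholders, not my real values):

from azure.storage.fileshare import ShareClient

# Placeholder: connection string stored in the same Databricks secret scope.
conn_str = dbutils.secrets.get(scope="KEY-WE-AAE-DWH", key="files-connection-string")

share_client = ShareClient.from_connection_string(conn_str, share_name="myshare")
file_client = share_client.get_file_client("spc/failed2/test.txt")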

First I mount the container in Databricks with the following code:

def mount(container, account_name):
    """Mount a blob storage container under /mnt/<container> if it is not mounted yet."""
    mount_list = [m.mountPoint for m in dbutils.fs.mounts()]
    mount_point = f"/mnt/{container}/"
    if mount_point not in mount_list:
        dbutils.fs.mount(
            source=f"wasbs://{container}@{account_name}.blob.core.windows.net/",
            mount_point=mount_point,
            # Look up the storage account key in the secret scope.
            extra_configs={f"fs.azure.account.key.{account_name}.blob.core.windows.net": dbutils.secrets.get(scope="KEY-WE-AAE-DWH", key=f"key-{account_name}")},
        )
        print(f"Container {container} is successfully mounted")
    else:
        print("Container is already mounted")

Then I try to write the downloaded file to the mount with the following code:

with open(f"/dbfs/mnt/datascience/spc/failed2/test.txt", "wb") as outfile:
    download_stream = file_client.download_file()
    outfile.write(download_stream.readall())

This raises the following error:

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/mnt/datascience/spc/failed2/test.txt'

The directory does not exist yet, and open() does not create missing parent directories, so I tried to create it with:

dbutils.fs.mkdirs('/mnt/datascience/spc/failed2/')

The problem is that this also creates empty directories. Do you know how to create a directory only if it will contain at least one file?
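
A sketch of what I have in mind (untested): download the content first, and create the directory only at the moment there is actually data to write, so no empty directory is left behind:

import os

target_dir = "/dbfs/mnt/datascience/spc/failed2"

download_stream = file_client.download_file()
data = download_stream.readall()

# Create the directory only when there is content to put in it.
if data:
    os.makedirs(target_dir, exist_ok=True)
    with open(os.path.join(target_dir, "test.txt"), "wb") as outfile:
        outfile.write(data)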

