I've deployed an Azure Function successfully, and I can pass data inputs into that function and operate on that data. My objective is to capture these inputs (the table names of the datasets the user requires) and then download those datasets from Blob storage. For the download piece, I have several pieces of code that allow me to successfully download a given file from Azure Data Lake when I run the Python code locally. However, when I place that code into the Azure Function, no download is initiated - I presume it's because the Azure Function has no reference to the sink into which the file needs to be downloaded.

Is there any way to persist data to local disk when a SAS URL is constructed and triggered from an Azure Function?

```python

import logging
import os

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:

    logging.info('The API has initialized and will execute now.')

    # Open the payload sent to this function
    req_body = req.get_json()

    # Save the data request format type
    # dataset_format = req.params.get('dataset_format')
    dataset_format = req_body.get('dataset_format')

    logging.info("********   - Checkpoint 1 -   **********")
    # dataset list passed in as parameter 
    datasets = req_body.get('datasets')
    dataset_1 = datasets[0]
    dataset_2 = datasets[1]
    dataset_3 = datasets[2]

    # Download Option 1 (preference - when executing the SAS URL from a browser, it shows the downloads tab at the bottom of the browser with the downloaded file/s)
    import webbrowser
    sas_url = "https://apptestdatalake.blob.core.windows.net/**filesystem_name**/**blob_name**/Iris.csv?**sas token**"
    webbrowser.open(sas_url)

    # Download Option 2
    from azure.storage.blob import BlobClient
    download_dir = "C:/Users/**user**/Downloads/Requested Downloads/"
    download_file_path = os.path.join(download_dir, "Iris.csv")
    print("\nDownloading blob data to \n\t" + download_file_path)

    # Create the target directory if it doesn't already exist
    os.makedirs(download_dir, exist_ok=True)

    # Download the blob via its SAS URL and write it to the local file
    with open(download_file_path, "wb") as download_file:
        blob_client = BlobClient.from_blob_url(sas_url)
        download_stream = blob_client.download_blob().readall()
        download_file.write(download_stream)

    print("Download Complete!")

    logging.info("********   - Checkpoint 2 -   **********")

    return func.HttpResponse(f"Hello! You've requested the {dataset_1}, {dataset_2}, {dataset_3} in {dataset_format}. This script has run successfully and your download(s) are complete!")
```
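For reference, here's a rough sketch of how I'm calling the function. The endpoint URL, function key, and the second and third dataset names below are placeholders; the payload keys match what the function reads above:

```python
import requests

# Placeholder endpoint - substitute your own function app URL and function key
function_url = "https://<function-app>.azurewebsites.net/api/<function-name>?code=<function-key>"

# Keys mirror what the function reads: 'dataset_format' and a list of 'datasets'
payload = {
    "dataset_format": "csv",
    "datasets": ["Iris", "<dataset_2>", "<dataset_3>"],
}

response = requests.post(function_url, json=payload)
print(response.status_code, response.text)
```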
    
1 Answer

Jim Xu (best answer):

Azure Functions is a PaaS-tier service that you can use to build your backend services. It cannot open a new browser window. If you want to do that, I suggest using JavaScript in your frontend application to open a new browser session/tab.

So regarding your need, I suggest you use the Azure Function to generate a SAS token for your blob, return the SAS URL to the frontend application, and download the file there. Regarding how to download, you can refer to the following code:

function myFunction(){
    // Create a temporary anchor element for the download link
    var a = document.createElement("a");
    // Set href to the SAS URL returned by the Azure Function
    a.href = "";

    // Add the anchor to the page and trigger the download by simulating a click
    document.body.appendChild(a);
    a.click();
}
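On the function side, a minimal sketch of generating that SAS URL could look like the following. This assumes the azure-storage-blob v12 package and that the storage account name and key are available to the function (e.g. from app settings); `build_download_url` is just a hypothetical helper name:

```python
from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas


def build_download_url(account_name, account_key, container_name, blob_name):
    # Create a read-only SAS token that expires in one hour
    sas_token = generate_blob_sas(
        account_name=account_name,
        container_name=container_name,
        blob_name=blob_name,
        account_key=account_key,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1),
    )
    # Full blob URL with the SAS token appended as the query string
    return (
        f"https://{account_name}.blob.core.windows.net/"
        f"{container_name}/{blob_name}?{sas_token}"
    )
```

The function returns this URL in its HTTP response, and the frontend assigns it to `a.href` in the snippet above, so the download happens in the user's browser rather than on the function host.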