Azure Data Factory Event Based Trigger not working as Expected


Storage Explorer version: 1.14.2 (build 20200715.2), Windows 10, ia32

Hello All,

I have created an event-based trigger to run some pipelines. When I manually upload CSV files to the designated blob location through Storage Explorer, the trigger fires as expected. However, when an external source pushes the files (I have backend Python code that uploads files to the same blob location), the trigger does not fire. I checked the content type: the manual upload is application/vnd.ms-excel, while the Python upload is application/octet-stream. Is the issue related to this, or is it something else?


1 Answer

Joseph Xu

Please check the version of the Python SDK.
I'm using the Python v12 SDK to upload blobs to Azure Blob Storage and it works well.

Here is my Python code:

import os
from azure.storage.blob import BlobServiceClient, __version__

try:
    print("Azure Blob Storage v" + __version__ + " - Python quickstart sample")

    # Create the BlobServiceClient from the connection string
    connect_str = os.getenv('AZURE_STORAGE_CONNECTION_STRING')
    blob_service_client = BlobServiceClient.from_connection_string(connect_str)

    # Container to upload into
    container_name = "test"

    # Local file to upload
    local_path = "./data"
    local_file_name = "Test.csv"
    upload_file_path = os.path.join(local_path, local_file_name)
    print(upload_file_path)

    # Create a blob client using the local file name as the name for the blob
    blob_client = blob_service_client.get_blob_client(container=container_name, blob=local_file_name)

    print("\nUploading to Azure Storage as blob:\n\t" + local_file_name)

    # Upload the file
    with open(upload_file_path, "rb") as data:
        blob_client.upload_blob(data)
except Exception as ex:
    print('Exception:')
    print(ex)

When I use Python to upload the CSV file to Azure Blob Storage, the event trigger fires and the pipeline runs as expected:

(Screenshot: the triggered pipeline run in Azure Data Factory.)