I'm trying to use a Custom Activity in Azure Data Factory to run, on an Azure Batch pool, a Python script stored in Blob Storage.
I followed the Microsoft tutorial https://learn.microsoft.com/en-us/azure/batch/tutorial-run-python-batch-azure-data-factory
My problem is that when I execute the ADF pipeline, the activity fails:

When I check in the Batch Explorer tool, I see this BlobAccessDenied message:

Depending on the run, this happens for the ADF reference files and also for my Python script.
I have linked the Storage Account to the Batch Account.

I'm new to this and I'm not sure what I need to do to solve it.
Thank you in advance for your help.
I tried to reproduce the issue and it works fine for me. Please check the following points while creating the pipeline.
In the ADF portal, click the "Manage" icon in the left pane and then click +New to create a Blob Storage linked service.
Search for "Azure Blob Storage" and then click Continue.
Fill in the required details for your Storage account, test the connection, and then click Apply.
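If the connection test passes but the Batch task still reports BlobAccessDenied, it can help to check the same credentials outside ADF. This is a minimal sketch, assuming the azure-storage-blob (v12) package, a placeholder connection string, and a container named "input" that holds main.py and iris.csv (adjust the names to your setup):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string for the same storage account used in the linked service.
conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("input")  # assumed container name

# If this listing fails with an authorization error, the account key or firewall
# settings are the likely cause of the BlobAccessDenied seen in Batch Explorer.
for blob in container.list_blobs():
    print(blob.name)
```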
Similarly, search for the Azure Batch linked service (under the Compute tab).
Fill in the details of your Batch account, select the previously created storage linked service under "Storage linked service name", and then test the connection. Click Save.
Later, when you create the custom ADF pipeline, provide the Batch linked service name under the "Azure Batch" tab.
Under the "Settings" tab, provide the storage linked service name and the other required information. In "Folder Path", provide the blob container/folder where you have the main.py and iris.csv files (a sketch of the equivalent definitions in code follows below).
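For reference, the same wiring can also be expressed in code. This is only a minimal sketch with a recent azure-mgmt-datafactory SDK, assuming placeholder subscription, resource group, factory, account names and keys, an assumed pool named "custom-activity-pool", and an assumed container "input"; it mirrors the portal steps above rather than replacing them:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, AzureBlobStorageLinkedService, AzureBatchLinkedService,
    LinkedServiceReference, SecureString, CustomActivity, PipelineResource,
)

# Placeholder identifiers - replace with your own.
sub_id, rg, df_name = "<subscription-id>", "<resource-group>", "<data-factory>"
adf = DataFactoryManagementClient(DefaultAzureCredential(), sub_id)

# Blob Storage linked service (the one referenced as "Storage linked service name").
storage_ls = LinkedServiceResource(properties=AzureBlobStorageLinkedService(
    connection_string="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"))
adf.linked_services.create_or_update(rg, df_name, "AzureBlobStorage1", storage_ls)

# Azure Batch linked service, pointing at the storage linked service above.
batch_ls = LinkedServiceResource(properties=AzureBatchLinkedService(
    account_name="<batch-account>",
    access_key=SecureString(value="<batch-key>"),
    batch_uri="https://<batch-account>.<region>.batch.azure.com",
    pool_name="custom-activity-pool",  # assumed pool name
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="AzureBlobStorage1")))
adf.linked_services.create_or_update(rg, df_name, "AzureBatch1", batch_ls)

# Custom activity: runs main.py on the pool. "Folder Path" from the Settings tab
# corresponds to resource_linked_service + folder_path here.
activity = CustomActivity(
    name="testActivity",
    command="python main.py",
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="AzureBatch1"),
    resource_linked_service=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="AzureBlobStorage1"),
    folder_path="input")  # assumed container/folder holding main.py and iris.csv

adf.pipelines.create_or_update(rg, df_name, "adfv2-python-batch",
                               PipelineResource(activities=[activity]))
```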
Once this is done, you can validate, debug, publish, and trigger the pipeline. The pipeline should run successfully.
Once the pipeline has run successfully, you will see the iris_setosa.csv file in your output blob container.
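In case it helps to cross-check what the script is expected to do: per the tutorial, main.py reads iris.csv, keeps only the setosa rows, and writes iris_setosa.csv to an output container. A minimal sketch along those lines, assuming pandas and azure-storage-blob are available on the pool, a "Species" column in iris.csv, an output container named "output", and a connection string passed through an environment variable (the tutorial wires the credentials differently):

```python
import os
import pandas as pd
from azure.storage.blob import BlobServiceClient

# The Custom Activity downloads everything under "Folder Path" (main.py, iris.csv)
# into the task's working directory, so iris.csv can be read locally.
df = pd.read_csv("iris.csv")
df = df[df["Species"] == "setosa"]          # assumed column name
df.to_csv("iris_setosa.csv", index=False)

# Upload the result; the connection string is assumed to come from an environment
# variable here - adjust to however you pass credentials to the pool.
service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION_STRING"])
with open("iris_setosa.csv", "rb") as f:
    service.get_blob_client(container="output", blob="iris_setosa.csv").upload_blob(f, overwrite=True)
```

If the BlobAccessDenied error persists even with this wiring, regenerating the storage account key and updating it in both the Batch account's linked storage and the ADF linked service is worth trying, since a stale key is a common cause of that message.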