I want to convert an XML file that resides inside my Azure Blob Storage container to CSV using an Azure Function written in Python, but I need to pass the XML file name and the CSV file name as query parameters when I test/run the function. How can I do that?
I wrote the following Python code, but it returns a 500 error:
from io import StringIO
import logging
from azure.storage.blob import BlobServiceClient
import azure.functions as func
import pandas as pd
import requests
constrin = "conn-str"
connection_string = constrin
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
container_client = blob_service_client.get_container_client(r"func-data-01")
fileName = requests.params.get('fileName')
File_Name = container_client.get_blob_client(fileName)
blob = File_Name.download_blob().readall().decode("utf-8") # decoding depends on the file type; UTF-8 works for text files such as CSV
data = pd.read_csv(StringIO(blob)) # the file contents are loaded into a DataFrame for further use
print(f"file is found and the name is: {File_Name}")
def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')

    if name:
        return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.")
    else:
        return func.HttpResponse(
            "This HTTP-triggered function executed successfully. Pass a name in the query string or the request body for a personalized response.",
            status_code=200
        )
Final output / query parameters: (screenshots)
I have reproduced your requirement in my environment.
I could read an XML file available in the storage container and convert it to a CSV file using a Python Azure Function.
My Python Azure function code:
This function code downloads the XML file locally, converts it to CSV, and then uploads the result back to the Azure Storage container.
__init__.py:
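A minimal sketch of such an __init__.py, assuming the connection string is stored in the AzureWebJobsStorage application setting, the container is named func-data-01 (as in the question), and the two file names arrive as xmlname and csvname query parameters; all of these names are illustrative and should be adjusted to your setup:

import io
import logging
import os

import azure.functions as func
import pandas as pd
from azure.storage.blob import BlobServiceClient

# Container name, app-setting name and query-parameter names are assumptions;
# replace them with the values used in your environment.
CONTAINER_NAME = "func-data-01"


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    # Read both file names from the query string, e.g. ?xmlname=a.xml&csvname=a.csv
    xml_name = req.params.get('xmlname')
    csv_name = req.params.get('csvname')
    if not xml_name or not csv_name:
        return func.HttpResponse(
            "Pass both 'xmlname' and 'csvname' in the query string.",
            status_code=400
        )

    # Take the storage connection string from application settings
    # instead of hard-coding it in the function body.
    connection_string = os.environ["AzureWebJobsStorage"]
    blob_service_client = BlobServiceClient.from_connection_string(connection_string)
    container_client = blob_service_client.get_container_client(CONTAINER_NAME)

    # Download the XML blob and load it into a DataFrame
    # (assumes a flat, row-oriented XML document that pandas.read_xml can parse).
    xml_bytes = container_client.get_blob_client(xml_name).download_blob().readall()
    df = pd.read_xml(io.BytesIO(xml_bytes))

    # Serialize the DataFrame to CSV in memory and upload it as a new blob.
    csv_data = df.to_csv(index=False)
    container_client.get_blob_client(csv_name).upload_blob(csv_data, overwrite=True)

    return func.HttpResponse(
        f"Converted {xml_name} to {csv_name} in container {CONTAINER_NAME}.",
        status_code=200
    )

Note that the query parameters are read inside main() from req.params; reading them at module level (and from the requests library, as in the code in the question) runs before any request exists, which is likely why the function fails with a 500 before the request is even handled. This sketch also converts the file in memory with pandas.read_xml and DataFrame.to_csv rather than writing a local temporary file; the result in the container is the same. (pandas.read_xml uses the lxml package by default; pass parser="etree" if you want to avoid that dependency.)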
Portal:
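From the portal, open the function's Code + Test blade and use Test/Run, adding the two query parameters there, or call the function URL directly from any HTTP client. Assuming the parameter names from the sketch above (the host and function names below are placeholders), the request looks like:

https://<your-function-app>.azurewebsites.net/api/<your-function-name>?xmlname=input.xml&csvname=output.csv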