I'm working on creating an Azure Function that takes in a POSTed file and processes it. I have the basics set up, and I can successfully POST a small file. Whenever I POST a large file, I get the following error message.
A ScriptHost error has occurred
Exception while executing function: Functions.HttpTriggerCSharp. Microsoft.Azure.WebJobs.Host: Exception binding parameter 'request'. System.ServiceModel: The maximum message size quota for incoming messages (65536) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element. The maximum message size quota for incoming messages (65536) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element.
Exception while executing function: Functions.HttpTriggerCSharp
Executed: 'Functions.HttpTriggerCSharp' (Failed)
Function had errors. See Azure WebJobs SDK dashboard for details. Instance ID is '5fc0eaa2-0159-4185-93e4-57a4b2d4bb7f'
I haven't been able to find any Azure Functions documentation on where to set that property. Is it possible to increase the maximum message size for Azure Functions?
Edit
function.json
{
  "disabled": false,
  "bindings": [
    {
      "name": "request",
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [ "GET", "POST" ],
      "route": "test"
    },
    {
      "name": "response",
      "type": "http",
      "direction": "out"
    }
  ]
}
run.csx
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage request, TraceWriter log)
{
    log.Info($"C# HTTP trigger function processed a request. RequestUri={request.RequestUri}");

    // parse query parameter
    string name = request.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    return name == null
        ? request.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : request.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}
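For reference, the file handling itself isn't shown above; reading the POSTed body in the same run.csx style might look like this sketch (the processing step is a placeholder, not code from the question):

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage request, TraceWriter log)
{
    // Stream the POSTed body instead of buffering it all into memory.
    using (var stream = await request.Content.ReadAsStreamAsync())
    {
        // ... process the uploaded file here ...
        log.Info($"Received {request.Content.Headers.ContentLength} bytes");
    }

    return request.CreateResponse(HttpStatusCode.OK);
}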
Just an idea: would you consider splitting the file upload out of the POST, pushing the content to something like an Azure blob, and then POSTing a link to the file (which can be protected by a time-limited link) so your Azure Function can grab it from there? It is very common practice to separate large file uploads from your primary POST request, which is not optimised for handling large files. A sketch of the pattern follows below.
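A minimal sketch of that pattern, assuming the classic WindowsAzure.Storage SDK; the container name, method names, and 15-minute expiry are illustrative, not something from the original question:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class LargeFileRelay
{
    // Client side: push the large file to blob storage first, then POST
    // only a short-lived, read-only SAS URL to the function.
    public static async Task<string> UploadAndGetSasUrlAsync(string connectionString, string localPath)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("uploads"); // hypothetical container name
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlockBlobReference(Path.GetFileName(localPath));
        await blob.UploadFromFileAsync(localPath);

        // The "time-limited link": a SAS token that is read-only and expires in 15 minutes.
        string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
        });
        return blob.Uri.AbsoluteUri + sas;
    }

    // Function side: the HTTP trigger receives the small URL payload and
    // streams the blob down itself, so the request body stays tiny.
    public static async Task ProcessFromUrlAsync(string sasUrl)
    {
        using (var http = new HttpClient())
        using (var stream = await http.GetStreamAsync(sasUrl))
        {
            // ... process the file stream here ...
        }
    }
}

This keeps the HTTP trigger well under any message size quota, since the function only ever receives a URL, and the blob SDK handles the large transfer on both sides.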