I have a method that accepts a blob directory and a memory stream of text that needs to be read line by line and appended to an Azure append blob. My storage account is general-purpose v2 (Standard/Hot tier) with Data Lake Gen 2 enabled. At present I cannot append to blobs: every append attempt fails with a 409 Conflict error, even though no other clients are writing to this container.
Below is a cut-down (for simplicity) version of my code that reproduces the issue. Note the code targets .NET Framework 4.6:
/* the following libraries are in use
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
*/
static void WriteCSV(
    Stream csvStream,
    CloudBlobContainer tgtContainer // this is a Gen 2 container
)
{
    csvStream.Position = 0;
    using (StreamReader sr = new StreamReader(csvStream))
    {
        string csvFileName = $"test-append-blob_{DateTime.Now:yyyy-MM-ddTHHmmss}.csv";
        CloudAppendBlob blobCSVFile = tgtContainer.GetAppendBlobReference("iso/dwh/" + csvFileName); // need to create this in this directory
        if (!blobCSVFile.Exists()) blobCSVFile.CreateOrReplace(); // at this point I can see the empty file being created

        while (sr.Peek() >= 0)
        {
            string csvLine = sr.ReadLine();
            // some additional logic omitted for brevity
            blobCSVFile.AppendText(csvLine); // 409 Conflict occurs here !!!
        }
    }
}
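For context, this is roughly how I invoke the method (the connection string, container name, and CSV content below are placeholders, not my real values):

```csharp
// Hypothetical call site - the account name, key, and container name are placeholders.
CloudStorageAccount account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=mystorageacct;AccountKey=...;EndpointSuffix=core.windows.net");
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("my-container");

// Small in-memory CSV just to demonstrate the failure; the real streams can be large.
using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes("col1,col2\r\n1,2\r\n3,4")))
{
    WriteCSV(ms, container); // the empty blob is created, then AppendText throws 409 Conflict
}
```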
I am using an append blob and writing line by line deliberately; one reason is that the stream content can potentially be large, and I want to avoid buffering the whole file in memory.
Could you please guide me towards the correct approach to achieve this, and explain why a 409 Conflict would occur here when no other processes or sessions are using this container/directory?
Please let me know if I need to provide any additional information here.
Thanks in advance