Large file download/upload with Azure Blob Storage


I'm working on some interactions with Azure Blob Storage: downloading, uploading, and fetching files. But I'm having a lot of problems when the file is large (100 MiB+). The first problem is slowness, both downloading and uploading. At a certain point, the front-end application needs to list some content, and for each item the back-end application has to go to storage to fetch the file's binary (it's not my fault, haha).

I read some articles that helped me a little, but I still need some tips. I would be very grateful if you could help me:

How can I improve this performance? I read something about avoiding MemoryStream and using parallel requests (https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-tune-upload-download).

To not make it too long, I'll put some parts of the code:

        public async Task<bool> Upload(IFormFile arquivo, string CaminhoRelativo, string contentType, bool NaoComprimir = false)
        {
            string path = BlobHelper.GerarArquivoPath(arquivo.FileName, CaminhoRelativo);
            await _cloudBlobContainer.CreateIfNotExistsAsync();
            CloudBlockBlob blockBlobReference = _cloudBlobContainer.GetBlockBlobReference(path);
            try
            {
                // Copies the whole file into memory -- the main scalability
                // problem for 100 MiB+ files.
                MemoryStream memoryStream = BlobHelper.GerarArquivoStream(arquivo);
                if (NaoComprimir)
                {
                    await blockBlobReference.UploadFromStreamAsync(memoryStream);
                    // Set the content type on the blob that was just uploaded.
                    blockBlobReference.Properties.ContentType = contentType;
                    await blockBlobReference.SetPropertiesAsync();
                }
                else
                {
                    using MemoryStream memStream = new MemoryStream(CompressaoHelper.ComprimirArquivoGZIP(memoryStream));
                    await blockBlobReference.UploadFromStreamAsync(memStream);
                }
            }
            catch (StorageException ex)
            {
                throw new Exception("Error uploading file: " + path + ". Details: " + ex.Message, ex);
            }

            return true;
        }
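As a sketch of the MemoryStream-free approach the linked tuning article describes (this assumes the newer Azure.Storage.Blobs SDK rather than the CloudBlockBlob API above, and the `BlobUploader` class and its member names are hypothetical), the file can be streamed straight from the `IFormFile` to the blob with tuned transfer options:

```csharp
using System.Threading.Tasks;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.AspNetCore.Http;

public class BlobUploader
{
    private readonly BlobContainerClient _container;

    public BlobUploader(BlobContainerClient container) => _container = container;

    // Streams the request body directly to the blob, avoiding a full
    // in-memory copy of the file (no MemoryStream).
    public async Task UploadStreamingAsync(IFormFile file, string path, string contentType)
    {
        BlobClient blob = _container.GetBlobClient(path);

        var options = new BlobUploadOptions
        {
            HttpHeaders = new BlobHttpHeaders { ContentType = contentType },
            TransferOptions = new StorageTransferOptions
            {
                // Upload up to 8 blocks of 8 MiB in parallel; these numbers
                // are a starting point to tune per the Microsoft article.
                MaximumConcurrency = 8,
                MaximumTransferSize = 8 * 1024 * 1024,
                InitialTransferSize = 8 * 1024 * 1024
            }
        };

        using var stream = file.OpenReadStream();
        await blob.UploadAsync(stream, options);
    }
}
```

Note that this also sets the content type in the same call as the upload, so no separate `SetPropertiesAsync` round trip is needed.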

What is the best approach for converting and transferring the data over the network? Byte array? Base64?

Thank you very much!

1 Answer

Answered by Venkatesan:


I agree with Panagiotis Kanavos's comment: if you need to upload/download large files, you can use the Azure Data Movement library.

Here is my sample code to upload a large (250 MB) file from local disk to Azure Blob Storage.

Code:

class Program
{
    public static void Main(string[] args)
    {
        string storageConnectionString = "<Storage connection string>";
        CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer blobContainer = blobClient.GetContainerReference("test");
        blobContainer.CreateIfNotExists();

        string sourceFilePath = @"xxxxx"; // local file to upload
        CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("sample.pdf");

        // Number of parallel block transfers used by the Data Movement library
        TransferManager.Configurations.ParallelOperations = 64;

        // Set up the transfer context and track the upload progress
        SingleTransferContext context = new SingleTransferContext
        {
            ProgressHandler = new Progress<TransferStatus>(progress =>
            {
                Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
            })
        };

        // Upload the file and time the transfer
        Stopwatch stopWatch = Stopwatch.StartNew();
        var task = TransferManager.UploadAsync(
            sourceFilePath, destBlob, null, context, CancellationToken.None);
        task.Wait();
        stopWatch.Stop();
        Console.WriteLine("Elapsed time: {0}", stopWatch.Elapsed);
    }
}

The above code ran and uploaded the large file to Azure Blob Storage.
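Since the question also covers downloads, here is a sketch of the download direction with the same Data Movement library (the `@"xxxxx"` destination path is a placeholder, matching the upload sample):

```csharp
using System;
using System.Threading;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

class DownloadSample
{
    public static void Main(string[] args)
    {
        string storageConnectionString = "<Storage connection string>";
        CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("test");

        CloudBlockBlob sourceBlob = container.GetBlockBlobReference("sample.pdf");
        string destFilePath = @"xxxxx"; // local destination file path

        TransferManager.Configurations.ParallelOperations = 64;

        // Track the download progress
        SingleTransferContext context = new SingleTransferContext
        {
            ProgressHandler = new Progress<TransferStatus>(progress =>
                Console.WriteLine("Bytes downloaded: {0}", progress.BytesTransferred))
        };

        // Download the blob to a local file in parallel chunks
        TransferManager.DownloadAsync(
            sourceBlob, destFilePath, null, context, CancellationToken.None).Wait();
    }
}
```

This streams the blob to disk in parallel chunks, so the back end never needs to hold the whole file in memory.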

Portal: (screenshot showing the uploaded blob in the storage container)