Alternative method for HttpResponse.TransmitFile in ASP.NET Core


I want to support downloading large files (> 4 GB) from my ASP.NET Core backend. Many articles point out that HttpResponse.TransmitFile in the .NET Framework could achieve this goal.

However, it seems that HttpResponse.TransmitFile is no longer available in .NET Core.

Does anyone know what the alternative to HttpResponse.TransmitFile is in .NET Core? Any relevant answers would be much appreciated.

There are 2 answers

Panagiotis Kanavos

I suspect the real question isn't finding the alternative to TransmitFile (it's return File(path) or return File(stream)), but handling request ranges so clients can download large files in chunks that can be retried if interrupted.

Luckily, this is already supported by both the ControllerBase.File method available since ASP.NET Core 2.1 and the Results.File method used in Minimal APIs (among others). Range processing is off by default but can be enabled by passing true to the enableRangeProcessing parameter, for example:

public class VideoController : Controller
{
    [HttpGet, Route("videos/video.mp4")]
    public IActionResult Index()
    {
        // PhysicalFile is used here because this is an absolute filesystem
        // path (File expects a virtual path); the verbatim string keeps the
        // backslashes, and the last argument enables range processing.
        return PhysicalFile(@"d:\Videos\video.mp4", "video/mp4", enableRangeProcessing: true);
    }
}
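
In a Minimal API, Results.File exposes the same switch. A minimal sketch, reusing the same placeholder path and route (the named argument skips the other optional parameters):

var app = WebApplication.Create(args);

// Results.File also supports range processing via enableRangeProcessing.
app.MapGet("/videos/video.mp4", () =>
    Results.File(@"d:\Videos\video.mp4", "video/mp4", enableRangeProcessing: true));

app.Run();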

Even better, the Static Files provider also supports ranges (and response compression) out of the box. If the large files are in a specific folder, you could serve them with:

app.UseStaticFiles(new StaticFileOptions
{
    // Verbatim string so the backslashes survive; PhysicalFileProvider
    // expects an absolute path.
    FileProvider = new PhysicalFileProvider(@"path\to\large\files"),
    RequestPath = "/Videos"
});

If you want to use response compression in your own actions, you'll have to enable it either on the web server or explicitly through the response compression middleware:

builder.Services.AddResponseCompression(options =>
{
    options.EnableForHttps = true;
});

var app = builder.Build();

app.UseResponseCompression();

From that point on, it's up to the client to retrieve specific chunks and retry them. Download utilities typically download large files in chunks and retry failed parts automatically. Khalid Abuhakmeh describes the process and how it works with ASP.NET Core in a short blog post.

In C#, HttpClient can request specific chunks of the file, and even download them concurrently, using the Range header, e.g.:

var req = new HttpRequestMessage
{
    RequestUri = new Uri(url)
};
// Request the first 1000 bytes; a server that honors the Range header
// responds with 206 Partial Content (which counts as a success status).
req.Headers.Range = new RangeHeaderValue(0, 999);

var resp = await client.SendAsync(req);

if (resp.IsSuccessStatusCode)
{
    using var tempFile = File.Create("chunk.001");
    await resp.Content.CopyToAsync(tempFile);
}

If you have a list of ranges, you can use it to download the remote file in parallel and combine the chunks later:

record MyRange(long Start, long End);

async Task DownloadChunkAsync(HttpClient client, Uri uri, MyRange range, CancellationToken ct)
{
    var req = new HttpRequestMessage
    {
        RequestUri = uri
    };
    req.Headers.Range = new RangeHeaderValue(range.Start, range.End);
    var resp = await client.SendAsync(req, ct);
    if (resp.IsSuccessStatusCode)
    {
        // Name each chunk file after its starting offset.
        using var tempFile = File.Create($"chunk.{range.Start}");
        await resp.Content.CopyToAsync(tempFile, ct);
    }
}

var ranges = CalculateRanges(...);
var uri = new Uri(url);

// Concurrent downloads
await Parallel.ForEachAsync(ranges, async (range, ct) =>
{
    await DownloadChunkAsync(client, uri, range, ct);
});

// Combine the chunks in order
using (var finalStream = File.Create("finalFile.mp4"))
{
    foreach (var range in ranges.OrderBy(r => r.Start))
    {
        using var chunkStream = File.OpenRead($"chunk.{range.Start}");
        await chunkStream.CopyToAsync(finalStream);
    }
}
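
CalculateRanges is left elided above. A minimal sketch of such a helper (the name, signature, and chunk size are my own, and it assumes the server answers a HEAD request with a Content-Length):

static async Task<List<MyRange>> CalculateRangesAsync(HttpClient client, Uri uri, long chunkSize)
{
    // Hypothetical helper, not part of the original answer: learn the
    // total size via HEAD, then slice it into inclusive byte ranges.
    using var head = new HttpRequestMessage(HttpMethod.Head, uri);
    using var resp = await client.SendAsync(head);
    resp.EnsureSuccessStatusCode();

    var length = resp.Content.Headers.ContentLength
        ?? throw new InvalidOperationException("Server did not report a Content-Length.");

    var ranges = new List<MyRange>();
    for (long start = 0; start < length; start += chunkSize)
    {
        // Range headers are inclusive on both ends.
        var end = Math.Min(start + chunkSize, length) - 1;
        ranges.Add(new MyRange(start, end));
    }
    return ranges;
}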
Jason Pan

You can use the sample below to implement the requirement. For more details, check the blog post Streaming Zip on ASP.NET Core.

private static HttpClient Client { get; } = new HttpClient();
[HttpGet]
public async Task<FileStreamResult> Get()
{
    // get your stream
    var stream = await Client.GetStreamAsync("https://raw.githubusercontent.com/StephenClearyExamples/AsyncDynamicZip/master/README.md");

    return new FileStreamResult(stream, new MediaTypeHeaderValue("text/plain"))
    {
        FileDownloadName = "README.md"
    };
}

For zip (FileCallbackResult and WriteOnlyStreamWrapper are custom types from the linked blog post; a sketch of the former follows at the end):

private static HttpClient Client { get; } = new HttpClient();
[HttpGet]
public IActionResult Get()
{
    var filenamesAndUrls = new Dictionary<string, string>
    {
        { "README.md", "https://raw.githubusercontent.com/StephenClearyExamples/AsyncDynamicZip/master/README.md" },
        { ".gitignore", "https://raw.githubusercontent.com/StephenClearyExamples/AsyncDynamicZip/master/.gitignore" },
    };

    return new FileCallbackResult(new MediaTypeHeaderValue("application/octet-stream"), async (outputStream, _) =>
    {
        using (var zipArchive = new ZipArchive(new WriteOnlyStreamWrapper(outputStream), ZipArchiveMode.Create))
        {
            foreach (var kvp in filenamesAndUrls)
            {
                var zipEntry = zipArchive.CreateEntry(kvp.Key);
                using (var zipStream = zipEntry.Open())
                using (var stream = await Client.GetStreamAsync(kvp.Value))
                    await stream.CopyToAsync(zipStream);
            }
        }
    })
    {
        FileDownloadName = "MyZipfile.zip"
    };
}

This solution has all the same advantages as the previous non-Core solution:

  1. All I/O is asynchronous. At no time are any threads blocked on I/O.
  2. The zip file is not held in memory. It is streamed directly to the client, compressing on-the-fly.
  3. For large files, not even a single file is read entirely into memory. Each file is individually compressed on-the-fly.
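
FileCallbackResult and WriteOnlyStreamWrapper are not built-in types; their full implementations are in the linked blog post. A rough sketch of the idea behind FileCallbackResult, assuming current ASP.NET Core APIs (details may differ from the blog's version):

using Microsoft.AspNetCore.Mvc;
using Microsoft.Net.Http.Headers;

// A FileResult that hands the raw response body stream to a
// caller-supplied callback instead of reading from a file or stream.
public class FileCallbackResult : FileResult
{
    private readonly Func<Stream, ActionContext, Task> _callback;

    public FileCallbackResult(MediaTypeHeaderValue contentType,
        Func<Stream, ActionContext, Task> callback)
        : base(contentType.ToString())
    {
        _callback = callback;
    }

    public override async Task ExecuteResultAsync(ActionContext context)
    {
        var response = context.HttpContext.Response;
        response.ContentType = ContentType;
        if (!string.IsNullOrEmpty(FileDownloadName))
        {
            var cd = new ContentDispositionHeaderValue("attachment");
            cd.SetHttpFileName(FileDownloadName);
            response.Headers["Content-Disposition"] = cd.ToString();
        }
        // No Content-Length is set, so the response is streamed to the client.
        await _callback(response.Body, context);
    }
}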