I'm working on a project where I need to upload very large files (approximately 500GB) from a ReactJS front-end to Azure Blob Storage, using a Python-based FastAPI or Flask backend. My goal is to optimize the upload process to make it as efficient as possible.
I'm aware that chunking the file and making multiple calls to the backend is a common approach for large file uploads, but it can be time-consuming, especially given the large file size.
I'd like to achieve the following:
- Upload the 500GB file in less than 20 minutes.
- Assume a stable internet connection with a speed of around 100 Mbps.
What are the best practices and tools I should consider to achieve this goal efficiently? Are there any specific libraries or methods that can help with optimizing the upload process? Additionally, are there any configuration or settings I should be aware of in Azure Blob Storage for such large uploads?
I'm open to any suggestions or recommendations to streamline this large file upload process effectively. Thank you for your insights!
You can try the method below to improve your blob upload performance:

Below is sample code combining a React frontend and a FastAPI backend for uploading a file to Azure Blob Storage.
React frontend:
A file upload form with an input field for selecting a file and a button to trigger the upload, as sketched below.
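A minimal sketch of such a component; the backend URL (`http://localhost:8000/upload/`) is a placeholder you would replace with your own:

```jsx
import { useState } from "react";

function FileUpload() {
  const [file, setFile] = useState(null);

  const handleUpload = async () => {
    if (!file) return;
    const formData = new FormData();
    // The field name "file" must match the parameter name in the FastAPI endpoint.
    formData.append("file", file);
    const response = await fetch("http://localhost:8000/upload/", {
      method: "POST",
      body: formData,
    });
    console.log(await response.json());
  };

  return (
    <div>
      <input type="file" onChange={(e) => setFile(e.target.files[0])} />
      <button onClick={handleUpload}>Upload</button>
    </div>
  );
}

export default FileUpload;
```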
FastAPI backend:
The `/upload/` endpoint allows the client to upload a file to Azure Blob Storage. It accepts a file upload using the `UploadFile` type.
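A minimal sketch of that endpoint using the `azure-storage-blob` v12 SDK; the container name `uploads` and the `AZURE_STORAGE_CONNECTION_STRING` environment variable are assumptions you would adapt to your setup:

```python
import os

from azure.storage.blob import BlobServiceClient
from fastapi import FastAPI, File, UploadFile

app = FastAPI()

# Larger blocks plus parallel block uploads are the main throughput levers
# for big files; both limits are configurable on the client.
blob_service_client = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"],
    max_block_size=8 * 1024 * 1024,       # upload in 8 MiB blocks
    max_single_put_size=8 * 1024 * 1024,  # anything larger goes through block upload
)

@app.post("/upload/")
def upload(file: UploadFile = File(...)):
    # A sync endpoint, so FastAPI runs it in a worker thread and the
    # blocking SDK call does not stall the event loop.
    blob_client = blob_service_client.get_blob_client(
        container="uploads", blob=file.filename
    )
    # Stream the request body straight to Blob Storage; max_concurrency
    # uploads multiple blocks in parallel.
    blob_client.upload_blob(file.file, overwrite=True, max_concurrency=4)
    return {"filename": file.filename}
```

Tuning `max_block_size` and `max_concurrency` is where most of the gain comes from, but raw bandwidth remains the hard limit: at roughly 100 Mbps, transferring 500 GB takes on the order of 11 hours no matter how the upload is chunked, so the 20-minute target would require a much faster link.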