The following code is a very simple self-hosted file transfer service.
It works for files smaller than 100 MB.
How can I make this work for files of 1 GB or more, which might exceed php.ini's upload_max_filesize and post_max_size?
More generally, how can we do a POST XMLHttpRequest file upload (with a file appended to a FormData object), with a file bigger than the server's RAM?
Is there a way to do the XHR upload in chunks, and if so, how?
<?php
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $localfname = $_POST['fname'];
    $data = file_get_contents($_FILES['data']['tmp_name']);
    $fname = '';
    for ($i = 0; $i < 4; $i++) {
        $fname .= '0123456789abcdefghijklmnopqrstuvwxyz'[rand(0, 35)];
    }
    $trimmedfname = preg_replace('/\s+/', '', basename($localfname));
    $file = fopen('files/' . $fname . '_' . $trimmedfname, 'w');
    fwrite($file, $data);
    fclose($file);
    $link = 'files/' . $fname . '_' . $trimmedfname;
    echo 'link: <div id="link"><a href="' . $link . '">' . $link . '</a></div>';
    die();
}
?>
<body>
<div id="container" style="width: 200px; height: 200px; background-color: #eee; padding: 1em;">drag and drop your file here!</div>
<script>
var $ = document.querySelector.bind(document);
var readfiles = files => {
    var formData = new FormData();
    formData.append('fname', files[0].name);
    formData.append('data', files[0]);
    $("#container").innerHTML = 'beginning upload... progress: ';
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '');
    xhr.onload = () => {
        $("#container").innerHTML = xhr.responseText;
    };
    xhr.upload.onprogress = (event) => {
        if (event.lengthComputable) {
            $("#container").innerHTML = 'uploading... progress: ' + (event.loaded / event.total * 100 | 0) + '%<br>';
        }
    };
    xhr.send(formData);
};
document.body.ondragover = () => { $("#container").innerHTML = 'drop your file here...'; return false; };
document.body.ondrop = (e) => { e.preventDefault(); readfiles(e.dataTransfer.files); };
</script>
</body>
True, chunking a file upload involves breaking down the file into smaller parts (chunks), sending each chunk separately, and then reassembling these parts on the server.
That would allow you to bypass the upload_max_filesize and post_max_size limits in php.ini, because each chunk can be kept small enough to fit within them. But you need to modify both the client-side and server-side code to handle chunked uploads.

On the client side, modify your JavaScript to slice the file into chunks and upload each piece sequentially (a simplified example of a more complex process):
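A sketch of what that client side could look like, assuming a 5 MB chunk size and an endpoint named upload.php — the chunkedUpload name and the chunkIndex/totalChunks field names are illustrative and must match whatever the server expects:

```javascript
// 5 MB per chunk, safely below typical php.ini limits.
const CHUNK_SIZE = 5 * 1024 * 1024;

// Number of chunks needed for a file of the given size.
function chunkCount(fileSize, chunkSize) {
  return Math.max(1, Math.ceil(fileSize / chunkSize));
}

function chunkedUpload(file) {
  const total = chunkCount(file.size, CHUNK_SIZE);

  const sendChunk = (index) => {
    const start = index * CHUNK_SIZE;
    // file.slice() returns a Blob view of the file; the browser streams
    // it, so the whole file is never read into memory at once.
    const chunk = file.slice(start, start + CHUNK_SIZE);
    const formData = new FormData();
    formData.append('fname', file.name);
    formData.append('chunkIndex', index);
    formData.append('totalChunks', total);
    formData.append('data', chunk);

    const xhr = new XMLHttpRequest();
    xhr.open('POST', 'upload.php');
    xhr.onload = () => {
      if (index + 1 < total) {
        sendChunk(index + 1); // upload the next piece only after this one succeeds
      } else {
        document.querySelector('#container').innerHTML = xhr.responseText;
      }
    };
    xhr.send(formData);
  };

  sendChunk(0);
}
```

Chaining the next upload inside xhr.onload keeps exactly one chunk in flight at a time, which is the simplest way to guarantee the server receives the pieces in order.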
The JavaScript function chunkedUpload(file) should be triggered with the file you want to upload. You can modify your drag-and-drop or file-input listeners to use this function.

On the server side, modify your PHP script to handle chunked uploads. The script needs to append each chunk to a temporary file. Once all chunks have been received, it can then process the file as needed.
As a basic example (for a production environment, you should add error handling, security checks, and possibly support for concurrent uploads):
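A minimal server-side sketch, assuming the client POSTs fields named fname, chunkIndex and totalChunks plus a file part named data (these names are illustrative); each chunk is appended to a .part file, which is renamed into place once the last chunk arrives:

```php
<?php
// Strip the path and any whitespace from the client-supplied file name.
function sanitize_name($name) {
    return preg_replace('/\s+/', '', basename($name));
}

if (($_SERVER['REQUEST_METHOD'] ?? '') === 'POST') {
    $name        = sanitize_name($_POST['fname']);
    $chunkIndex  = (int) $_POST['chunkIndex'];
    $totalChunks = (int) $_POST['totalChunks'];
    $tmpPath     = 'files/' . $name . '.part';

    // Append this chunk ('ab' = append, binary-safe) by streaming it,
    // so the server never holds more than one small chunk in RAM.
    $in  = fopen($_FILES['data']['tmp_name'], 'rb');
    $out = fopen($tmpPath, 'ab');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    if ($chunkIndex + 1 === $totalChunks) {
        // Last chunk received: move the assembled file into place.
        $link = 'files/' . $name;
        rename($tmpPath, $link);
        echo 'link: <div id="link"><a href="' . $link . '">' . $link . '</a></div>';
    } else {
        echo 'chunk ' . $chunkIndex . ' received';
    }
    die();
}
```

Note that this trusts the client to send chunks in order; a production version would verify the chunk index against the current size of the .part file and lock it against concurrent writes.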
Yes, your approach to calling the chunkedUpload function should work. When a user drops a file onto the designated area, the event listener prevents the default action (which could be opening the file or navigating to it), retrieves the file from the dataTransfer object, and then passes it to the chunkedUpload function for processing.

You can add a few extra touches to ensure a smooth user experience, a bit like in this example:
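For instance, a small helper along these lines — the wireDropZone name is made up here, and it assumes an upload function such as chunkedUpload already exists; the element and the callback are passed in as parameters so the helper stays reusable:

```javascript
// Highlight the drop zone while a file is dragged over it, reset the
// highlight on drag-leave, and hand the dropped file to the upload function.
function wireDropZone(container, upload) {
  container.addEventListener('dragover', (e) => {
    e.preventDefault();                       // required to allow a drop at all
    container.style.backgroundColor = '#cfc'; // visual "you can drop here" cue
  });

  container.addEventListener('dragleave', () => {
    container.style.backgroundColor = '#eee'; // back to the idle colour
  });

  container.addEventListener('drop', (e) => {
    e.preventDefault();
    container.style.backgroundColor = '#eee';
    if (e.dataTransfer.files.length > 0) {
      container.innerHTML = 'uploading...';   // immediate feedback before the first chunk
      upload(e.dataTransfer.files[0]);
    }
  });
}

// In the page, something like:
// wireDropZone(document.querySelector('#container'), chunkedUpload);
```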
That would provide feedback during the drag-over and drag-leave events, making it clearer to users where they should drop their files. Once the file is dropped, it proceeds with the upload and gives an immediate visual response that something is happening.