Storing each and every chunk of data in PHP


Hello, I am trying to upload data to my server. My requirement is that if the uploaded content is a large file and the connection is disconnected, the upload must resume from where it stopped rather than start over. I can upload my file normally to the server, but I am unable to resume the upload. Is there any other way I can do this? Following is my code.

RESUME.PHP

    <?php

    $host = "localhost";
    $db_uname = "root";
    $db_pass = "";
    $db_name = "upload";

    $cont = mysql_pconnect($host, $db_uname, $db_pass) or die("Too many connections, please try later");
    mysql_select_db($db_name);

    ob_clean();

    if ($_POST['submit']) {

        define('CHUNK_SIZE', 1024 * 1024); // size (in bytes) of each chunk

        // Read a file and store its content chunk by chunk
        function readfile_chunked($filename, $retbytes = TRUE) {
            mysql_query("INSERT INTO uploads(chunk) VALUES('')") or die('insert query: ' . mysql_error());

            $last = mysql_insert_id();
            $file = '/tmp/' . $_FILES['file']['name']; // file where the chunks are appended
            $buffer = '';
            $cnt = 0;

            $handle = fopen($filename, 'rb');
            $a = file_get_contents($filename);

            $sql = mysql_fetch_array(mysql_query("select chunk from uploads where id='26'"));
            $b = $sql['chunk'];

            // check whether the chunked file in the folder and the data in the database are the same
            if (strcmp($a, $b) == 0) {
                echo "success";
            } else {
                echo "failure";
            }
            //exit;

            if ($handle === false) {
                return false;
            }

            while (!feof($handle)) {
                $buffer = fread($handle, CHUNK_SIZE);

                // Open the file to get the existing content
                $current = file_get_contents($file);
                // Append the new chunk
                $current .= $buffer;
                // Write the contents back to the file
                file_put_contents($file, $current);

                mysql_query("update uploads set chunk = concat(chunk, '" . mysql_real_escape_string($buffer) . "') where id='$last'") or die('update query: ' . mysql_error());

                ob_flush();
                flush();

                if ($retbytes) {
                    $cnt += strlen($buffer);
                    //echo $cnt;
                }
            }

            $status = fclose($handle);
            if ($retbytes && $status) {
                //echo $cnt;
                return $cnt; // return the number of bytes delivered, like readfile() does
            }
            return $status;
        }

        $filename = $_FILES['file']['tmp_name'];
        readfile_chunked($filename);
    }
    ?>

    <form action="resume.php" method="post" enctype="multipart/form-data">
        <input type="file" name="file" size="50" />
        <br />
        <input type="submit" name="submit" value="Upload File" />
    </form>

I can make the code work on localhost, but I have an issue using it on the server: my data is not chunked and saved on the server.
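In essence, a resumable upload needs the server to report how many bytes of the file it already holds, so the client can continue from that byte offset instead of starting over. A minimal sketch of such a status endpoint, assuming the partial file is kept on disk (the path, the ".part" suffix and the "name" parameter are illustrative, not taken from the code above):

    <?php
    // status.php - reports how many bytes of a given upload the server already
    // holds, so the client knows the offset to resume from.
    $uploadDir = '/tmp/uploads/';
    $name = basename($_GET['name']);            // original file name sent by the client
    $partial = $uploadDir . $name . '.part';

    // If a partial file exists, its size is the resume offset; otherwise start at 0.
    $offset = is_file($partial) ? filesize($partial) : 0;

    header('Content-Type: application/json');
    echo json_encode(array('offset' => $offset));

The client would call this before uploading and skip the bytes the server already has.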


There are 2 answers

Rustem Gafurov

You need to read the file into a string with jQuery, cut that string into pieces, and then upload each piece separately.
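On the PHP side, each piece can then be appended to a partial file on disk. A rough sketch of such a receiving script, assuming the client posts the slice together with its byte offset (the field names "chunk", "name" and "offset" are assumptions, not part of this answer):

    <?php
    // upload_chunk.php - appends one uploaded piece to a partial file.
    $uploadDir = '/tmp/uploads/';
    $name    = basename($_POST['name']);
    $offset  = (int) $_POST['offset'];
    $partial = $uploadDir . $name . '.part';

    // Only accept the piece if it continues exactly where the partial file ends,
    // so a dropped and retried request cannot corrupt the file.
    $current = is_file($partial) ? filesize($partial) : 0;
    if ($offset !== $current) {
        http_response_code(409);
        exit;
    }

    // Append the raw piece to the partial file.
    file_put_contents($partial, file_get_contents($_FILES['chunk']['tmp_name']), FILE_APPEND);
    echo 'ok';

On the browser side, File.slice() can produce the pieces without reading the whole file into a string at once.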

user7783493

You can try "resumable.js" (http://resumablejs.com/). It will create chunks of data and send them to the server side, and on the server side you can recreate the original file from the received chunks.
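Resumable.js posts each chunk with parameters such as resumableChunkNumber, resumableTotalChunks and resumableFilename, which the upload script can use when saving the pieces. A rough sketch of the final reassembly step, assuming each chunk was saved as "<name>.part<N>" (that naming and the "name"/"total" parameters are assumptions, not from the library):

    <?php
    // reassemble.php - stitches the numbered chunk files back into the original file.
    $chunkDir    = '/tmp/uploads/';
    $name        = basename($_POST['name']);
    $totalChunks = (int) $_POST['total'];

    $out = fopen($chunkDir . $name, 'wb');
    for ($i = 1; $i <= $totalChunks; $i++) {
        $piece = $chunkDir . $name . '.part' . $i;
        fwrite($out, file_get_contents($piece)); // append this chunk to the output file
        unlink($piece);                          // remove the chunk once it has been merged
    }
    fclose($out);
    echo 'done';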