I'm using Fine Uploader (https://fineuploader.com) with concurrent chunking to upload files to the server. For small files it's working great, but for large files I'm seeing an issue where the merged file ends up corrupt.
E.g. I tried a 2.7GB ZIP file: the final file is exactly the correct size in bytes but cannot be extracted ('File is invalid' and 'Unconfirmed start of archive').
I've also tried an 850MB PDF, which ended up corrupted and could not be viewed.
All the .tmp files are uploaded OK and their number matches the expected count from 'qqtotalparts'. I've tried changing the partSize from 2000000 to a larger size but still see the same issue.
Not sure where I'm going wrong, but any help or suggestions would be greatly appreciated!
JavaScript:
var manualUploader = new qq.FineUploader({
    element: document.getElementById("fine-uploader-manual-trigger"),
    template: 'qq-template-manual-trigger',
    request: {
        endpoint: "application.aspx?p=testnewhtml5storefiles"
    },
    deleteFile: {
        enabled: true,
        endpoint: "application.aspx?p=testnewhtml5storefiles"
    },
    chunking: {
        enabled: true,
        partSize: 10000000,
        concurrent: {
            enabled: true
        },
        success: {
            endpoint: "application.aspx?p=testnewhtml5storefiles&done=1"
        }
    },
    resume: {
        enabled: true
    },
    retry: {
        //enableAuto: true,
        showButton: true
    },
    thumbnails: {
        placeholders: {
            waitingPath: 'images/waiting-generic.png',
            notAvailablePath: 'images/not_available-generic.png'
        }
    },
    autoUpload: false,
    debug: true
});
Server side:
Dim totalParts As Integer = CInt(Request("qqtotalparts"))
Dim partIndex As Integer = CInt(Request("qqpartindex"))
Dim filename As String = Request("qqfilename")
Dim totalFileSizeName As String = Request("qqtotalfilesize")
Dim uploadedTemp As String = "\\testing\chunks\"
Dim uploadedLocation As String = "\\testing\"
System.IO.Directory.CreateDirectory(uploadedTemp)
Dim filePath As String = System.IO.Path.Combine(uploadedTemp, partIndex & ".tmp")
If Not System.IO.File.Exists(filePath) Then
    Dim inputStream As System.IO.Stream = Request.Files(0).InputStream
    Using fileStream As System.IO.FileStream = System.IO.File.OpenWrite(filePath)
        inputStream.CopyTo(fileStream)
    End Using
End If
'check if we have all parts
If partIndex = (totalParts - 1) Then
    'merge parts into one file
    mergeTempFiles(uploadedTemp, uploadedLocation, filename)
End If
Public Sub mergeTempFiles(pathOrigin As String, pathToSave As String, filename As String)
    Dim tmpfiles As String() = System.IO.Directory.GetFiles(pathOrigin, "*.tmp")
    If Not System.IO.Directory.Exists(pathToSave) Then
        System.IO.Directory.CreateDirectory(pathToSave)
    End If
    Dim outPutFile As New System.IO.FileStream(pathToSave & filename, System.IO.FileMode.Create, System.IO.FileAccess.Write)
    For Each tempFile As String In tmpfiles
        Dim buffer As Byte() = New Byte(1023) {}
        Dim inputTempFile As New System.IO.FileStream(tempFile, System.IO.FileMode.Open, System.IO.FileAccess.Read)
        Dim bytesRead As Integer = inputTempFile.Read(buffer, 0, buffer.Length)
        While bytesRead > 0
            outPutFile.Write(buffer, 0, bytesRead)
            bytesRead = inputTempFile.Read(buffer, 0, buffer.Length)
        End While
        inputTempFile.Close()
    Next
    outPutFile.Close()
End Sub
Just in case anyone else has this same issue, I found a solution.
Turns out the .tmp files for certain file types (such as PDFs) need to be merged in the same order they were created.
So I needed to check that all parts had been received, then loop through them in that order and copy each one to my output file. E.g.:
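A minimal sketch of that ordered merge, assuming the chunk handler above saves each part as "<partIndex>.tmp"; the MergeTempFilesInOrder name, the totalParts parameter, and the all-parts check are illustrative, not anything Fine Uploader provides:
Public Sub MergeTempFilesInOrder(pathOrigin As String, pathToSave As String, filename As String, totalParts As Integer)
    If Not System.IO.Directory.Exists(pathToSave) Then
        System.IO.Directory.CreateDirectory(pathToSave)
    End If
    'Only merge once every expected part is on disk: with concurrent chunking,
    'the part with the highest index is not necessarily the last one to arrive.
    For i As Integer = 0 To totalParts - 1
        If Not System.IO.File.Exists(System.IO.Path.Combine(pathOrigin, i.ToString() & ".tmp")) Then
            Return
        End If
    Next
    Using outPutFile As New System.IO.FileStream(System.IO.Path.Combine(pathToSave, filename), System.IO.FileMode.Create, System.IO.FileAccess.Write)
        'Append the parts strictly in part-index order: 0.tmp, 1.tmp, 2.tmp, ...
        For i As Integer = 0 To totalParts - 1
            Using inputTempFile As New System.IO.FileStream(System.IO.Path.Combine(pathOrigin, i.ToString() & ".tmp"), System.IO.FileMode.Open, System.IO.FileAccess.Read)
                inputTempFile.CopyTo(outPutFile)
            End Using
        Next
    End Using
End Sub
Iterating by part index avoids relying on Directory.GetFiles, which makes no ordering guarantee and whose string sort puts "10.tmp" before "2.tmp" — which would explain files that come out the right size but corrupted.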