I'm using DotNetZip in the following way:
using (ZipFile zip = new ZipFile(@"E:\test2.zip"))
{
    zip.ParallelDeflateThreshold = -1;
    zip.UseZip64WhenSaving = Zip64Option.Always;
    zip.SaveProgress += SaveProgress;
    zip.AddDirectory(@"E:\Samples");
    zip.Save(@"E:\test2.zip");
}
This code worked fine until I tried to zip a directory that holds close to 1 million files.
The scenario:
1. If the files are rather heavy:
The folder contains 928614 files in a lot of subdirectories; the complete folder size is 134 GB, but most of the files are only a few KB, and the biggest one is about 60 MB.
The application gets stuck during the save operation around file 870000. I know this by listening to the progress event:
public void SaveProgress(object sender, SaveProgressEventArgs e)
{
    if (e.EntriesSaved != 0)
    {
        System.Console.WriteLine(e.EntriesSaved.ToString());
    }
}
I left the application running for an entire night and nothing happened. In one of the test runs the application crashed completely (it was as if someone had pressed Ctrl+F5 in Visual Studio).
I also tried running the ZipIt.exe provided by the DotNetZip tools package, and the same thing happened.
2. If the files are very small (for test purposes):
The folder contains 970000 files in a lot of subdirectories; each file is a txt file containing the letter "a", and the complete folder size is 1 MB.
The save operation goes through all the files (as I can see using the SaveProgress event), but at the end I get an OutOfMemoryException.
I also tried running the ZipIt.exe provided by the DotNetZip tools package, and in that case the operation completed successfully (so confusing!).
Is there a maximum number of entries that DotNetZip supports? Is there a workaround? Thanks so much in advance.
The original ZIP specification allowed at most 65535 entries in an archive (as well as 4 GB per entry and 4 GB total archive size). To go beyond those limits you need to enable Zip64 for the archive:
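A minimal sketch of enabling Zip64 with DotNetZip (this mirrors the `UseZip64WhenSaving` setting already present in the question's code; the paths here are placeholders):

```csharp
using Ionic.Zip;  // DotNetZip

using (ZipFile zip = new ZipFile())
{
    // Zip64 lifts the classic ZIP limits: 65535 entries,
    // 4 GB per entry, and 4 GB total archive size.
    zip.UseZip64WhenSaving = Zip64Option.Always;
    zip.AddDirectory(@"E:\Samples");
    zip.Save(@"E:\test2.zip");
}
```

Note that your code already sets `Zip64Option.Always`, so the entry count itself should not be the blocker; the behavior you describe (a hang on the large set and an OutOfMemoryException on the small one) points at DotNetZip's in-memory bookkeeping for each entry rather than a format limit.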