I'm developing a WPF app in which I want to encrypt and compress heavy DataSets or objects (under 2 MB) and send them across the network. The other party would decompress and decrypt the data and consume it. This will happen in both directions: from the server (WCF) to the client and from the client to the server.
- I want an efficient compression class (I'd like to stick to the .NET compression classes).
- It should take little time to compress and decompress.
- The compression ratio should be high, while data retrieval must be 100% lossless.
Can anyone advise me on the compression classes (DeflateStream/GZipStream)?
Thanks
VJ
Most people assume that a smaller compressed size automatically means better network performance. In a general sense, "better-than-deflate" compression algorithms can reduce transfer bandwidth, but they can increase total transfer time (compression + transfer + decompression). In that sense, an LZ-class compressor seems best. The fastest implementations are QuickLZ and LZ4; both have C# versions. Their APIs are not exactly like DeflateStream's (actually simpler to use). QuickLZ has growing usage in network-related applications, while LZ4 was recently patched into the Apache Hadoop source trunk in place of Google's Snappy.
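Since you asked about the built-in classes specifically: as a baseline, a minimal in-memory round trip with DeflateStream looks roughly like this (a sketch assuming .NET 4, which added Stream.CopyTo; the class and method names are hypothetical only in their arrangement, the APIs themselves are standard System.IO.Compression):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class DeflateRoundTrip
{
    // Compress a byte array with the built-in DeflateStream.
    static byte[] Compress(byte[] data)
    {
        using (var output = new MemoryStream())
        {
            using (var deflate = new DeflateStream(output, CompressionMode.Compress))
                deflate.Write(data, 0, data.Length);
            return output.ToArray();
        }
    }

    // Decompress back to the original bytes.
    static byte[] Decompress(byte[] compressed)
    {
        using (var input = new MemoryStream(compressed))
        using (var deflate = new DeflateStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            deflate.CopyTo(output); // Stream.CopyTo requires .NET 4+
            return output.ToArray();
        }
    }

    static void Main()
    {
        byte[] original = Encoding.UTF8.GetBytes(new string('x', 10000));
        byte[] packed = Compress(original);
        byte[] restored = Decompress(packed);
        Console.WriteLine("{0} -> {1} bytes, lossless: {2}",
            original.Length, packed.Length,
            restored.Length == original.Length);
    }
}
```

GZipStream has the same shape; it just wraps the deflate payload in a gzip header and CRC, which costs a few extra bytes but adds an integrity check.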
If you need more compression, you can grab the LZMA SDK, which includes managed LZMA compression/decompression code. But I should warn you: LZMA's memory consumption is usually very high (it depends on the parameters). So spawning several LZMA-powered threads is probably not what you want.
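If you do go the LZMA route, a sketch of the usual framing looks like this. This assumes the SevenZip.Compression.LZMA namespace from the SDK's C# sources; the convention of writing the 5-byte coder properties followed by the uncompressed length is the common one, but check the SDK's own samples before relying on it:

```csharp
using System;
using System.IO;
using SevenZip.Compression.LZMA; // from the LZMA SDK's CS sources

class LzmaSketch
{
    // Compress: emit the 5-byte coder properties, then the original
    // length, then the LZMA-coded payload.
    static byte[] Compress(byte[] data)
    {
        var encoder = new Encoder(); // default parameters; dictionary size drives memory use
        using (var input = new MemoryStream(data))
        using (var output = new MemoryStream())
        {
            encoder.WriteCoderProperties(output);
            output.Write(BitConverter.GetBytes((long)data.Length), 0, 8);
            encoder.Code(input, output, data.Length, -1, null);
            return output.ToArray();
        }
    }

    // Decompress: read back the properties and length, then decode.
    static byte[] Decompress(byte[] compressed)
    {
        var decoder = new Decoder();
        using (var input = new MemoryStream(compressed))
        using (var output = new MemoryStream())
        {
            var props = new byte[5];
            input.Read(props, 0, 5);
            var lenBytes = new byte[8];
            input.Read(lenBytes, 0, 8);
            long outSize = BitConverter.ToInt64(lenBytes, 0);
            decoder.SetDecoderProperties(props);
            decoder.Code(input, output, input.Length - input.Position, outSize, null);
            return output.ToArray();
        }
    }
}
```

Note that the encoder's dictionary size (settable via SetCoderProperties) is what dominates memory use, which is why running many of these in parallel gets expensive fast.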
If you still need more compression, have a look at PPM or bitwise-CM class algorithms. PPM is very good on textual data and has average speed (usually 2-3 MiB/sec). CM, on the other hand, is very good on binary data. Their memory consumption can be high (it depends on the parameters), and they can be very slow (from 1 MiB/sec down to just a few bytes/sec, depending on the algorithm). Unfortunately, you will probably only find a PPM implementation for .NET on the internet; a CM implementation is really hard to find in .NET due to its high complexity. I have written an order-0 bitwise coder in .NET that could be extended to a proper CM with additional models if you really need it.