What is the best lossless compression algorithm for random data?


I need to compress a stream of random data like [25,94,182,3,254, ...]. There are close to 4 million values. I currently only get a 1.4x ratio with Huffman coding, and the LZW implementation I tried takes too much time to compress. I am hoping to find an efficient compression method that still gives a high compression ratio, at least 3x. Is there another algorithm that would be able to compress this random data better?


1 Answer

Aki Suihkonen:

It depends on the distribution of the RNG. A compression ratio of 1:1.4 suggests that the data is either not uniform or the generator is not a good one. Huffman and arithmetic coding are practically the only options*, since there is no other correlation between successive entries of a good RNG.

*To be precise, the best compression scheme has to be a zero-order statistical coder that can allocate a variable number of bits to each symbol in order to reach the Shannon entropy

H(X) = -\sum_{i=1}^{N} P(x_i) \log_2 P(x_i)
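
As a quick sanity check, you can estimate this zero-order entropy directly from the byte histogram and see what ratio is even achievable. A minimal Python sketch, assuming one byte per value; the `os.urandom` stand-in and the function name are placeholders, substitute the actual 4-million-value stream:

```python
import math
import os
from collections import Counter

def zero_order_entropy(data: bytes) -> float:
    """Shannon entropy H(X) in bits per symbol, treating each byte independently."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Placeholder stand-in for the real 4-million-byte stream from the question.
data = os.urandom(4_000_000)

h = zero_order_entropy(data)
print(f"entropy: {h:.3f} bits/byte")
print(f"best possible ratio for a zero-order coder: {8 / h:.2f}x")
```

If the printed bound comes out well below 3x on your real data, no zero-order lossless coder can reach the target ratio, regardless of how it is implemented.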

The theoretical optimum is achieved by arithmetic coding, but other encodings can come close if the symbol probabilities happen to suit them. Arithmetic coding can allocate less than one bit per symbol, whereas Huffman or Golomb coding needs at least one bit per symbol (or per symbol group).
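
To make that last point concrete, here is a small worked example with an invented, heavily skewed two-symbol source (the 0.95/0.05 probabilities are purely illustrative): the entropy drops well below one bit per symbol, which arithmetic coding can approach but symbol-by-symbol Huffman cannot.

```python
import math

# Hypothetical two-symbol source: symbol A with probability 0.95, B with 0.05.
p = {"A": 0.95, "B": 0.05}

# Shannon entropy: the average bits/symbol an ideal (arithmetic) coder can approach.
h = -sum(pi * math.log2(pi) for pi in p.values())

# Huffman cannot assign fewer than 1 bit to any single symbol, so its average
# code length is stuck at 1 bit/symbol here unless symbols are grouped.
huffman_avg = 1.0

print(f"entropy:           {h:.3f} bits/symbol")   # ~0.286
print(f"Huffman (minimum): {huffman_avg:.3f} bits/symbol")
```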