Decompressing Pako gzipped string with Python zlib


Say I want to compress the following JSON: {"test": 1, "test2": 2}

I do that on the client side using the Pako JS library:

const test = { "test": 1, "test2": 2 }
const test_json = JSON.stringify(test)
const gz_str = pako.gzip(test_json, { to: 'string' })
// Returns the string ����������«V*I-.Q²2Ô�3��¬�j�¨äK�����

Decompressing it with Pako works just fine:

const result = pako.ungzip(gz_str, { to: 'string' })
// Returns '{"test": 1, "test2": 2}'

Now, if I try to decompress on the server side with Python zlib:

import zlib
gzipped_string = '����������«V*I-.Q²2Ô�3��¬�j�¨äK�����'
s = zlib.decompress(gzipped_string.encode('utf-8'), 31)

I get zlib.error: Error -3 while decompressing data: incorrect header check. I get the same error if I ask zlib for automatic header detection with zlib.MAX_WBITS | 32.
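
For completeness, this is roughly what that automatic-header-detection attempt looks like (a minimal sketch, reusing the gzipped_string variable from the snippet above):

import zlib

# wbits = 32 + MAX_WBITS tells zlib to auto-detect whether the data
# carries a gzip or a zlib header.
s = zlib.decompress(gzipped_string.encode('utf-8'), zlib.MAX_WBITS | 32)
# Raises the same zlib.error: incorrect header check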

I've found many similar issues (like "Compressed with pako (zlib in javascript), decompressing with zlib (python) not working" or "zlib.error: Error -3 while decompressing: incorrect header check"), but they were due either to an encoding/decoding issue or to a wrong windowBits option in the zlib decompress method.

Some solutions were based on base64, but I want to keep the data as a string, not raw bytes. What am I missing?
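
For context, the base64-based workaround from those answers looks roughly like this on the Python side (a minimal sketch; payload_b64 is a hypothetical variable standing in for the base64 text the client would send instead of the raw binary string):

import base64
import zlib

# Hypothetical: base64 text received from the client.
payload_b64 = '...'

raw = base64.b64decode(payload_b64)
# wbits = 16 + MAX_WBITS expects a gzip header, like the 31 used above.
data = zlib.decompress(raw, zlib.MAX_WBITS | 16)

I'd rather avoid adding that extra encoding step if the string can be decompressed directly.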
