Why doesn't md5 (and other hash algorithms) output in base32?


It seems like most hashes (usually rendered in base16/hex) could be losslessly represented in base32, resulting in noticeably shorter (and more easily readable) hash strings: hex encodes 4 bits per character while base32 encodes 5, so a 128-bit MD5 digest drops from 32 characters to 26.
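As a minimal sketch of the comparison (the input string `b"hello world"` is just an arbitrary example), Python's standard library can encode the same raw digest both ways:

```python
import base64
import hashlib

digest = hashlib.md5(b"hello world").digest()  # 16 raw bytes (128 bits)

hex_str = digest.hex()                                   # 4 bits per character
b32_str = base64.b32encode(digest).decode().rstrip("=")  # 5 bits per character

print(len(hex_str))  # 32 characters
print(len(b32_str))  # 26 characters
```

Note that RFC 4648 base32 pads the output with `=` to a multiple of 8 characters; stripping the padding is lossless here because the digest length is fixed and known.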

I understand that a naive alphabet might mix up look-alike characters such as "O"/"0" and "I"/"1", but one could easily choose an alphabet that avoids such ambiguities. There are also enough characters to keep hashes case-insensitive. I know that shorter hash algorithms exist (like CRC32), but this idea could be applied to those too for even shorter output.
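Such an alphabet already exists: Crockford's base32 drops the easily confused letters I, L, O, and U. As a hedged sketch, one way to get it is to remap the standard RFC 4648 base32 output character-by-character (both alphabets list symbols in ascending digit-value order, so a straight translation preserves the encoding):

```python
import base64
import hashlib

STD_B32 = "ABCDEFGHIJKLMNOPQRSTUVWXYZ234567"   # RFC 4648, value 0..31
CROCKFORD = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"  # no I, L, O, or U
TO_CROCKFORD = str.maketrans(STD_B32, CROCKFORD)

digest = hashlib.md5(b"hello world").digest()
std = base64.b32encode(digest).decode().rstrip("=")
readable = std.translate(TO_CROCKFORD)
print(readable)  # 26-character, case-insensitive-friendly hash string
```

Crockford's scheme additionally treats `O` as `0` and `I`/`L` as `1` on decode, so even mistyped input can round-trip.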

Why, then, do most (if not all) hash algorithm implementations not output in base32, or at least provide an option to do so?


There are 0 answers