Why do Windows certutil and OpenSSL display CSR (PKCS#10) signature bytes differently?
I ran this command in Windows: certutil -dump [p10_filename]
output:
PKCS10 Certificate Request:
Version: 1
Subject: <*** REMOVED ***>
...
Signature Algorithm:
Algorithm ObjectId: 1.2.840.113549.1.1.4 md5RSA
Algorithm Parameters:
05 00
Signature: UnusedBits=0
0000 ff 9d 4b 25 15 ae 79 32 66 7b 9f 4e a4 17 1e f8
0010 3a 64 69 f5 99 a3 7b 8e c2 ee 2d 61 ef ec 78 c9
0020 9d bb 10 b3 60 36 96 f6 a0 3f 85 c4 3b 2e 16 25
0030 52 d9 81 a1 aa 56 d0 54 6c 28 12 7f 64 2d cd 1b
0040 83 3c 03 ad 74 27 02 a1 55 42 d5 12 8e dd dc cf
0050 a7 42 43 76 7d aa 47 d2 3e 62 b8 30 a3 83 1d 8b
0060 61 7f f4 9e ba f3 bd b6 d2 28 9b 5a b9 b1 38 06
0070 d4 42 85 91 64 d3 9d 6d 6a c4 3c f7 3b 6e 93 0a
0080 a8 b2 fd 2f 3e f5 ed fd fa a3 d0 d9 7a b6 71 96
0090 d9 03 be 32 d9 70 9d 5a f2 4a 5f db df 2a 8b cd
00a0 12 d9 71 29 e2 93 73 51 a0 ca f2 3c c8 b1 38 87
00b0 16 67 23 2e 2a 96 45 8f fe eb 8c 01 d7 b9 2e 3e
00c0 e6 7e 08 71 3b 5a ca 6a 23 29 73 49 88 84 1f 21
00d0 3b 83 ce 77 55 a3 31 fa d2 b5 61 c2 53 39 9b bc
00e0 e2 1d db d1 1b f7 27 a6 81 43 d2 c7 c8 f3 75 ad
00f0 3e 37 23 de 34 b3 8a 57 be 11 22 ef 4c c2 81 2f
Signature matches Public Key
Key Id Hash(rfc-sha1): 4e 6c d8 61 0c 91 a2 4a 07 ed af ae 05 c9 fb 95 cd c9 cc 7e
Key Id Hash(sha1): b7 63 38 21 9e 21 6a 82 eb bb a4 8e bc 68 5c 6f 07 a9 72 07
CertUtil: -dump command completed successfully.
When I ran the equivalent command with OpenSSL, I got a different result:
openssl req -in <csr_filename> -noout -text
Certificate Request:
Data:
Version: 0 (0x0)
.... clipped ...
Signature Algorithm: md5WithRSAEncryption
2f:81:c2:4c:ef:22:11:be:57:8a:b3:34:de:23:37:3e:ad:75:
f3:c8:c7:d2:43:81:a6:27:f7:1b:d1:db:1d:e2:bc:9b:39:53:
c2:61:b5:d2:fa:31:a3:55:77:ce:83:3b:21:1f:84:88:49:73:
29:23:6a:ca:5a:3b:71:08:7e:e6:3e:2e:b9:d7:01:8c:eb:fe:
8f:45:96:2a:2e:23:67:16:87:38:b1:c8:3c:f2:ca:a0:51:73:
93:e2:29:71:d9:12:cd:8b:2a:df:db:5f:4a:f2:5a:9d:70:d9:
32:be:03:d9:96:71:b6:7a:d9:d0:a3:fa:fd:ed:f5:3e:2f:fd:
b2:a8:0a:93:6e:3b:f7:3c:c4:6a:6d:9d:d3:64:91:85:42:d4:
06:38:b1:b9:5a:9b:28:d2:b6:bd:f3:ba:9e:f4:7f:61:8b:1d:
83:a3:30:b8:62:3e:d2:47:aa:7d:76:43:42:a7:cf:dc:dd:8e:
12:d5:42:55:a1:02:27:74:ad:03:3c:83:1b:cd:2d:64:7f:12:
28:6c:54:d0:56:aa:a1:81:d9:52:25:16:2e:3b:c4:85:3f:a0:
f6:96:36:60:b3:10:bb:9d:c9:78:ec:ef:61:2d:ee:c2:8e:7b:
a3:99:f5:69:64:3a:f8:1e:17:a4:4e:9f:7b:66:32:79:ae:15:
25:4b:9d:ff
I opened the file in TextPad using the hex editor, and the OpenSSL output appears to match the raw data. Is this some encoding difference, or some kind of wrapper that certutil uses to display the data that OpenSSL doesn't show?
The Microsoft APIs in general encode/decode numbers as little-endian values. For signatures, however, the exact byte order is specified, and officially the signature is not even a number. See RSA PKCS#1 v2.1, and I2OSP in particular: I2OSP encodes an integer value (of any size) as a statically sized, big-endian octet string. This is also reflected by the fact that the value is carried in a BIT STRING rather than an ASN.1 INTEGER. So the Microsoft representation is incorrect.
It is of course necessary to treat the signature as a number to do anything meaningful with it, so Microsoft probably first parses the signature into a number and then displays that number in hex using its own internal, little-endian representation. The signature value is the same; the bytes are simply reversed, so you see a mirror image of the signature.
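You can check this by reversing the byte order of one dump and comparing it with the other. A minimal sketch in Python, using just the first 16 bytes of the certutil dump and the last 16 bytes of the OpenSSL dump from the question (the full 256-byte signatures compare the same way):

# Hex strings copied from the two dumps in the question
certutil_first = bytes.fromhex("ff9d4b2515ae7932667b9f4ea4171ef8")  # certutil, offset 0000
openssl_last = bytes.fromhex("f81e17a44e9f7b663279ae15254b9dff")    # OpenSSL, tail of its listing

# Reversing certutil's byte order yields the big-endian (I2OSP) order that OpenSSL shows
assert certutil_first[::-1] == openssl_last
print("certutil shows the byte-reversed form of the OpenSSL signature")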