The jarsigner default signature algorithms are periodically updated to stronger ones. For us, using a local certificate, this used to be SHA1withDSA; now it's SHA256withDSA for the same certificate. SHA1withDSA is actually disabled as insecure now.
When I verify the new JAR file with `jarsigner -verify -verbose -certs xyz.jar`, the result is as expected:

```
Digest algorithm: SHA-256
Signature algorithm: SHA256withDSA, 1024-bit key (weak)
```
However, when I verify the same JAR file using OpenSSL with `openssl cms -cmsout -inform DER -print -in LOCALSIG.DSA`, it still reports SHA1withDSA:

```
digestAlgorithms:
    algorithm: sha256 (2.16.840.1.101.3.4.2.1)
signature:
    algorithm: dsaWithSHA1 (1.2.840.10040.4.3)
sig_alg:
    algorithm: dsaWithSHA1 (1.2.840.10040.4.3)
```
Similar with `openssl pkcs7 -inform der -print_certs -text -in LOCALSIG.DSA`:

```
Signature Algorithm: dsaWithSHA1
```
Question 1: How is it possible that the two tools report different signature algorithms? They used to report the same algorithm. One of them must be wrong; perhaps jarsigner created an ambiguous format?
The reason why I double-checked this is that I use a custom JAR verifier built on the Windows crypto APIs. It worked fine for years, but it can't handle this new JAR file. The first crypto API function I call (`CryptVerifyDetachedMessageSignature()`) already returns a rather unhelpful and undocumented error (50 = `ERROR_NOT_SUPPORTED` = "The request is not supported."). I guess that's because of the ambiguous format.
Question 2: Is there any way to handle this weird signature in my code? Or should I rely on fixing this at the source, namely the JAR file?
Our current workaround is to explicitly specify `jarsigner -sigalg SHA1withDSA`.
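For reference, the full signing call looks roughly like this (the keystore path and alias are placeholders):

```
# force the old signature algorithm; keystore/alias names are made up
jarsigner -keystore keystore.jks -sigalg SHA1withDSA xyz.jar signer-alias
```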
I should also mention that we don't use the original Oracle jarsigner, but the one from SAPMachine, based on OpenJDK IIRC.
You're not paying attention to the headings. The first two lines are almost directly under `d.signedData:` and are for the hash used in signing the data; the latter two pairs are under the certificate entry (`certificates:` / `d.certificate:`), and they describe the signature in the certificate, not the signature on the data. If this is a self-signed certificate, then its signature does not affect security and can be ignored; if it is a 'real' (CA-issued) certificate, you should get a new certificate from the (or any suitable) CA.
If you look further down under `signerInfos:` you should see `signatureAlgorithm:`, and that is for the signature on the data. `pkcs7 -print_certs -text` displays ONLY the certificate(s), not ANY information about the data signature; in fact, in spite of the name, it is designed for 'p7b/p7c' format data, which has only certificate(s) and/or CRL(s) and no data signature at all. See my answer on a better Stack.

As to your actual problem, it's probably the 'custom verifier', but as you give no information about it I can't even try to help (not that I can help much with MSCAPI/CNG anyway, but some people here can). FWIW, you could see if OpenSSL thinks the signature is valid with the command sketched below, if the .SF file is self-contained, i.e. the cert is self-signed, or directly under a root, or the chain/intermediate cert(s) is/are included.
If the cert is under an intermediate that is omitted from the .SF -- which might occur if the keystore you used to sign is not set up 'canonically' -- it is still possible to fully verify, but more complicated; probably better to settle for `-CAfile imed.pem -partial_chain`, as in the example below.
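For example (with `imed.pem` as a stand-in name for the intermediate cert):

```
# accept a chain that ends at the intermediate instead of a self-signed root
openssl cms -verify -binary -inform DER -in LOCALSIG.DSA \
    -content LOCALSIG.SF -purpose any -CAfile imed.pem -partial_chain -out /dev/null
```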
Alternatively, as you reference in comments, for this case we only need to check the data signature and not the certificate at all, so just use `-noverify`; then you don't need `-CAfile` or to worry about the chain (if any) being complete.
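That variant reduces to something like (same placeholder names as above):

```
# check only the data signature; skip certificate (chain) verification
openssl cms -verify -binary -inform DER -in LOCALSIG.DSA \
    -content LOCALSIG.SF -noverify -out /dev/null
```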