I am new to Crypto++ and have been struggling for a while with the creation of private keys for ECDSA signing.
I have a hex-encoded private exponent, E4A6CFB431471CFCAE491FD566D19C87082CF9FA7722D7FA24B2B3F5669DBEFB, stored as a string.
I want to use this to sign a text block using ECDSA. My code looks a bit like this:
string Sig::genSignature(const string& privKeyIn, const string& messageIn)
{
    AutoSeededRandomPool prng;

    ECDSA<ECP, SHA256>::PrivateKey privateKey;
    privateKey.AccessGroupParameters().Initialize(ASN1::secp256r1());
    privateKey.Load(StringSource(privKeyIn, true, NULL).Ref());

    ECDSA<ECP, SHA256>::Signer signer(privateKey);

    // Determine maximum size, allocate a string with that size
    size_t siglen = signer.MaxSignatureLength();
    string signature(siglen, 0x00);

    // Sign, and trim signature to actual size
    siglen = signer.SignMessage(prng, (const byte*) messageIn.data(), (size_t) messageIn.length(), (byte*) signature.data());
    signature.resize(siglen);

    cout << signature.data() << endl;

    return signature;
}
This code throws the following exception in Visual Studio when I try to do privateKey.Load(...):
First-chance exception at 0x7693C42D in DLLTest.exe: Microsoft C++ exception: CryptoPP::BERDecodeErr at memory location 0x0033EEA8.
Unhandled exception at 0x7693C42D in DLLTest.exe: Microsoft C++ exception: CryptoPP::BERDecodeErr at memory location 0x0033EEA8.
I am guessing I am doing something a bit stupid... any help would be great.
PS: I had a similar issue using ECDH for GMAC generation, but got round it by saving the key as a SecByteBlock. That trick doesn't seem to work in this case.
You have a private exponent, not a private key, so you should not call Load on it. That is what causes the Crypto++ BERDecodeErr exception.

The answer is detailed on the ECDSA wiki page, but it is not readily apparent: you need to initialize the privateKey from the curve and the private exponent, by building an Integer from the exponent string. Prepending "0x" ensures the Integer class will parse the ASCII string correctly; you can also append an "h" character to the string instead. You can see the parsing code for the Integer class at Integer.cpp, around line 2960, in the StringToInteger function.
Another way to do the same thing is to let a HexDecoder perform the ASCII to binary conversion for you. The buffer held by the HexDecoder will then be consumed by the Integer using its Decode (BufferedTransformation &bt, size_t inputLen, Signedness=UNSIGNED) method.
And here is yet another way using HexDecoder (Crypto++ is as bad as scripting languages at times :). After initializing the key, whichever way you choose, you should validate it:
The cout in your code will output binary data, by the way. If you want something printable/readable, run the signature through a Crypto++ HexEncoder.