Why does Curve25519 calculate key pair correctly even though its parameters are wrong?


It seems that .NET (Core 3.1) supports custom curves in ECC, so I've defined Curve25519 and generated a key pair with the code below:

using System;
using System.Security.Cryptography;

namespace Curve25519
{
    class Program
    {
        static void Main(string[] args)
        {
            ECCurve ecCurve = new ECCurve() // Curve25519, 32 bytes, 256 bit
            {
                CurveType = ECCurve.ECCurveType.PrimeMontgomery,
                B = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1 },
                A = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0x07, 0x6d, 0x06 }, // 486662
                G = new ECPoint()
                {
                    X = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 9 },
                    Y = new byte[] { 0x20, 0xae, 0x19, 0xa1, 0xb8, 0xa0, 0x86, 0xb4, 0xe0, 0x1e, 0xdd, 0x2c, 0x77, 0x48, 0xd1, 0x4c,
                    0x92, 0x3d, 0x4d, 0x7e, 0x6d, 0x7c, 0x61, 0xb2, 0x29, 0xe9, 0xc5, 0xa2, 0x7e, 0xce, 0xd3, 0xd9 }
                },
                Prime = new byte[] { 0x7f, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
                0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xed },
                //Prime = new byte[] { 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1 },
                Order = new byte[] { 0x10, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
                0x14, 0xde, 0xf9, 0xde, 0xa2, 0xf7, 0x9c, 0xd6, 0x58, 0x12, 0x63, 0x1a, 0x5c, 0xf5, 0xd3, 0xed },
                Cofactor = new byte[] { 8 }
            };

            using (ECDiffieHellman ecdhOwn = ECDiffieHellman.Create())
            {
                // generate the key pair
                ecdhOwn.GenerateKey(ecCurve);
                // save ECDiffieHellman implicit parameters including private key
                ECParameters ecdhParamsOwn = ecdhOwn.ExportParameters(true);
                // print key pair
                Console.WriteLine(BitConverter.ToString(ecdhParamsOwn.D) + "\r\n" + BitConverter.ToString(ecdhParamsOwn.Q.X) + "\r\n" + BitConverter.ToString(ecdhParamsOwn.Q.Y));
            }
        }
    }
}

A sample output is below:

90-54-A7-71-C0-03-D9-69-40-21-A4-CF-8C-81-7C-09-C4-CD-7A-44-77-2E-19-AD-B7-09-82-C9-AC-6E-AF-46
80-32-26-BD-C3-85-BC-35-17-98-B1-6C-C7-31-EF-BE-21-91-BA-CD-4A-BD-87-5B-FB-EC-4B-6B-02-C9-07-46
00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00

Then I wanted to cross-check with another library/platform, namely x-cube-cryptolib on an stm32f103c8. Given the private key generated by ECDiffieHellman (first line), the control library calculated the same public key (second line), validating the pair (hooray).

Before proceeding to the key-exchange phase, I wanted to play with it, so I altered the parameters of Curve25519, starting with the commented-out Prime in the code above. I expected either an error, or the two platforms computing different public keys from the same private key. But no: ECDiffieHellman always calculated key pairs that the control library confirmed. I made curve parameters wrong, swapped them, or zeroed them, and I did this for every parameter, cleaning and rebuilding the project each time, but the result was the same every time. Even when I proceeded to the key-exchange phase, ECDiffieHellman calculated the same shared secret key material as the control library.

Why does ECDiffieHellman/Curve25519 still generate key pairs and a shared secret that agree with the control library, even though its defining parameters are wrong, seemingly ignoring them? Or is this about .NET Core's ECDH implementation?


1 Answer

Best answer, by Woodstock:

I don't know the libs you mention, but I do know a fair bit about curve25519.

ECDH is of course the act of taking a counterpart's public key point (really [k]G, where k is their private key, a clamped 256-bit number, and G is the curve's generator point) and multiplying it by your private key, yielding yourK * theirK * G.

This process is commutative, which is why it works when the counterpart does the same with your public key and their private key.
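As a minimal sketch of that commutativity (reusing the ecCurve definition from your question; GenerateKey, PublicKey and DeriveKeyMaterial are the standard .NET API, the rest is just illustrative), two ECDiffieHellman instances on the same curve derive the same key material from each other's public keys:

using (ECDiffieHellman alice = ECDiffieHellman.Create())
using (ECDiffieHellman bob = ECDiffieHellman.Create())
{
    // both parties generate a key pair on the same (custom) curve
    alice.GenerateKey(ecCurve);
    bob.GenerateKey(ecCurve);

    // each side combines its own private scalar with the other's public point
    byte[] aliceSecret = alice.DeriveKeyMaterial(bob.PublicKey);   // derived from aliceK * bobK * G
    byte[] bobSecret   = bob.DeriveKeyMaterial(alice.PublicKey);   // derived from bobK * aliceK * G

    // identical output, because scalar multiplication commutes
    Console.WriteLine(BitConverter.ToString(aliceSecret));
    Console.WriteLine(BitConverter.ToString(bobSecret));
}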

Now, as to why the curve parameters seemingly don't matter: curve25519 is a highly optimised elliptic-curve cryptosystem. The scalar multiplication is optimised (variable-base scalar multiplication is used for ECDH), the point arithmetic is optimised, and so on. The multiplication is executed using only the X coordinate and differential additions (the Montgomery ladder specified in RFC 7748).

X25519 (curve25519 + ECDH) exclusively uses "X-only" scalar multiplication, where points are represented only by their X coordinate. This is one of the fastest and simplest ways to do key exchange in constant time, and constant time is important for resisting side-channel timing attacks.
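To make "X-only scalar multiplication with differential additions" concrete, here is a sketch of the RFC 7748 Montgomery ladder in C# using BigInteger (illustrative only, not constant time; the class and method names are mine). Notice that the only curve data it touches are the prime 2^255 - 19 and the ladder constant a24 = (486662 - 2) / 4; the B parameter, the generator's Y coordinate, the order, and the cofactor never appear.

using System;
using System.Numerics;

static class X25519Ladder
{
    // field prime p = 2^255 - 19 and ladder constant a24 = (A - 2) / 4
    static readonly BigInteger P = BigInteger.Pow(2, 255) - 19;
    static readonly BigInteger A24 = 121665;

    static BigInteger Mod(BigInteger x) => ((x % P) + P) % P;

    // RFC 7748 clamping of a 32-byte little-endian private scalar
    public static BigInteger Clamp(byte[] key)
    {
        byte[] e = (byte[])key.Clone();
        e[0] &= 248; e[31] &= 127; e[31] |= 64;
        return new BigInteger(e, isUnsigned: true, isBigEndian: false);
    }

    // Montgomery ladder: computes the X coordinate of k*P from the X coordinate
    // of P alone, using differential additions; no Y coordinate, no B parameter.
    public static BigInteger ScalarMult(BigInteger k, BigInteger u)
    {
        BigInteger x1 = u, x2 = 1, z2 = 0, x3 = u, z3 = 1;
        int swap = 0;

        for (int t = 254; t >= 0; t--)
        {
            int kt = (int)((k >> t) & 1);
            swap ^= kt;
            if (swap == 1) { (x2, x3) = (x3, x2); (z2, z3) = (z3, z2); }
            swap = kt;

            // one combined differential-addition / doubling step (RFC 7748, section 5)
            BigInteger a = Mod(x2 + z2), aa = Mod(a * a);
            BigInteger b = Mod(x2 - z2), bb = Mod(b * b);
            BigInteger e = Mod(aa - bb);
            BigInteger c = Mod(x3 + z3), d = Mod(x3 - z3);
            BigInteger da = Mod(d * a), cb = Mod(c * b);
            x3 = Mod((da + cb) * (da + cb));
            z3 = Mod(x1 * (da - cb) * (da - cb));
            x2 = Mod(aa * bb);
            z2 = Mod(e * (aa + A24 * e));
        }
        if (swap == 1) { (x2, x3) = (x3, x2); (z2, z3) = (z3, z2); }

        // projective -> affine: x2 / z2 mod p via Fermat's little theorem
        return Mod(x2 * BigInteger.ModPow(z2, P - 2, P));
    }
}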

The only time the curve is actually needed is when we execute EdDSA point decompression: the EdDSA wire format for points consists of the Y coordinate and the sign of the X coordinate.

It's not that the curve is ignored; of course elliptic-curve operations must respect the underlying curve over which they operate, and indeed the Galois field over which that curve is defined. It's more that the computation being used is guaranteed to stay on the curve by definition.

If you're zeroing all parameters, that's weird, but if you're still leaving the field (Prime in your case) as 2^255 - 19, that must be enough for the ECDH class to know what to do.
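As a hypothetical usage of the X25519Ladder sketch above (assuming its usings are in scope), you can regenerate a public key from a private key using nothing but that prime and the base point's X coordinate, 9; no B, no generator Y, no order, no cofactor is ever consulted, which matches what you observed:

byte[] d = new byte[32];
System.Security.Cryptography.RandomNumberGenerator.Fill(d);  // a fresh 32-byte private scalar

BigInteger k = X25519Ladder.Clamp(d);                // clamp it, as X25519 requires
BigInteger publicX = X25519Ladder.ScalarMult(k, 9);  // public key = k * G, from G's X coordinate (9) alone
Console.WriteLine(publicX.ToString("x"));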

Thus, in short, I think it's likely not actually using the curve equation in the ECDH calculations.