Function to XOR two 128-bit values. How do I generate 128-bit values?


I'm trying to learn simple cryptography, and as a starter I'm trying to achieve the following:

A function that takes two 128-bit parameters (key and plaintext) as input and returns their XOR. I know plain XOR is not secure, but I'm starting out with a simple example.

This is what I have tried:

class Program
{
    static void Main(string[] args)
    {
        string key = "B25829846AED8"; //128 bits??
        string plaintext = "A9BB51625ECBE"; //128 bits??

        //Convert key to byte array
        byte[] keyBytes = new byte[key.Length * sizeof(char)];
        System.Buffer.BlockCopy(key.ToCharArray(), 0, keyBytes, 0, keyBytes.Length);

        //Convert plaintext to byte array
        byte[] plaintextBytes = new byte[plaintext.Length * sizeof(char)];
        System.Buffer.BlockCopy(plaintext.ToCharArray(), 0, plaintextBytes, 0, plaintextBytes.Length);

        //Encrypt (XOR)
        string result = new Encrypter().encrypt(keyBytes, plaintextBytes);
    }
}

Encrypter.cs:

using System.Collections;
using System.Text;

class Encrypter
{
    public string encrypt(byte[] key, byte[] plaintext)
    {
        BitArray keyBits = new BitArray(key);
        BitArray plaintextBits = new BitArray(plaintext);

        if (keyBits.Length == plaintextBits.Length)
        {
            BitArray result = keyBits.Xor(plaintextBits);

            // BitArray.ToString() only returns the type name,
            // so build the bit string explicitly.
            var sb = new StringBuilder(result.Length);
            foreach (bool bit in result)
            {
                sb.Append(bit ? '1' : '0');
            }
            return sb.ToString();
        }

        return null;
    }
}

My problem:

I'm struggling with what to put in as the key and plaintext. How can I ensure that the values are exactly 128 bits each?

E.g. B25829846AED8 is apparently a 128-bit WEP key. But if I assign it to my key variable, then inside the encrypt method the keyBits.Length property is 208, which I don't get. The key parameter also has a length of 26, which confuses me as well.


There are 2 answers

Daniel Hesslow (Best Answer)

Why is the key length 26?

C# strings are Unicode, so you can write any character out there, e.g. Chinese, Japanese, and so on. Each char takes two bytes (UTF-16), so 13 * 2 = 26.
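
You can see this directly with a quick check (not part of the original answer, just an illustration using System.Text.Encoding):

using System;
using System.Text;

// Each char in a .NET string is a UTF-16 code unit, i.e. 2 bytes.
string key = "B25829846AED8";                   // 13 characters
byte[] bytes = Encoding.Unicode.GetBytes(key);  // 13 * 2 = 26 bytes
Console.WriteLine(bytes.Length);                // 26
Console.WriteLine(bytes.Length * 8);            // 208 bits, the value you saw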

Is your WEP key 128 bits? You've got a key for the 128-bit WEP protocol, which actually uses 104-bit keys (fun times). See the Wikipedia article on WEP.

But as far as I understand, you're not trying to implement WEP, you're trying to encrypt something. Take two random 64-bit integers (longs), convert them to bytes, and put them one after the other. BAM, 128 bits :)

using System.Linq;

byte[] key = BitConverter.GetBytes(25L).Concat(BitConverter.GetBytes(284L)).ToArray();
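
If you want actual random key material rather than hard-coded numbers, a sketch of another option (not from the original answer) is to fill a 16-byte array using a cryptographic random number generator:

using System.Security.Cryptography;

// Sketch: 16 random bytes = 128 bits of key material.
byte[] key = new byte[16];
using (var rng = RandomNumberGenerator.Create())
{
    rng.GetBytes(key); // key now holds 128 cryptographically random bits
}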

Other than that you seem to have it under control, good luck :)

T_D

You want to use 128-bit keys, in other words 16 bytes. Strings are made of chars, and the char datatype in C# uses 2 bytes (16 bits). So you could make a 16-byte key from a string of length 8, which is somewhat problematic because it is difficult to use the full 128-bit range due to unprintable characters and so on. It would be much easier to represent the key as a byte array of length 16 from the start, for example: byte[] key = {1, 8, 255, 12, 2, 1, 1, 1, 1, 1, 1, 2, 3, 4, 5, 5};
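
Building on that, a small sketch of the XOR itself over two such 16-byte blocks (the method name Xor128 is made up for this example):

using System;

// XOR two 16-byte (128-bit) blocks byte by byte.
static byte[] Xor128(byte[] key, byte[] plaintext)
{
    if (key.Length != 16 || plaintext.Length != 16)
        throw new ArgumentException("Both inputs must be exactly 16 bytes (128 bits).");

    byte[] result = new byte[16];
    for (int i = 0; i < 16; i++)
    {
        result[i] = (byte)(key[i] ^ plaintext[i]);
    }
    return result;
}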

You say that B25829846AED8 is a 128-bit key. Interpreted as a C# string this is not true: 13 chars = 26 bytes = 208 bits, which explains the value you saw. Interpreting each character as a hexadecimal digit, the key would be 13 * 4 = 52 bits. Interpreting each character as an ANSI character (8 bits each), it would be 13 * 8 = 104 bits.
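
If you do want to treat such a string as hexadecimal, a sketch of the conversion might look like this (the helper name HexToBytes is made up; note that a 128-bit key needs 32 hex digits, not 13):

using System;

// Interpret a hex string (2 digits per byte) as a byte array.
// A 128-bit key needs exactly 32 hex digits -> 16 bytes.
static byte[] HexToBytes(string hex)
{
    if (hex.Length % 2 != 0)
        throw new ArgumentException("Hex string must have an even number of digits.");

    byte[] bytes = new byte[hex.Length / 2];
    for (int i = 0; i < bytes.Length; i++)
    {
        bytes[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
    }
    return bytes;
}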

So to produce the byte array for the key from a string or a number, you have to define how you interpret that string or number. As said above, the easiest option is to enter the 16 bytes directly.
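
To see how much the interpretation matters, the same 16-character string gives different byte counts depending on the encoding you pick (a quick illustration; the string value is arbitrary):

using System;
using System.Text;

string s = "0123456789ABCDEF";                          // 16 characters
Console.WriteLine(Encoding.ASCII.GetBytes(s).Length);   // 16 bytes = 128 bits
Console.WriteLine(Encoding.Unicode.GetBytes(s).Length); // 32 bytes = 256 bits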