Java AES Encryption - Sending initialization vector between Client and Server


I'm generating the initialization vector on the client side, where the messages are encrypted and then sent to the server together with the IV for decryption.

Client Code:

    String key1 = "1234567812345678";
    byte[] key2 = key1.getBytes();

    SecretKeySpec secret = new SecretKeySpec(key2, "AES");
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");

    cipher.init(Cipher.ENCRYPT_MODE, secret);

    byte[] encrypted = cipher.doFinal(msg.getBytes(StandardCharsets.UTF_8));
    byte[] iv = cipher.getIV();

    String text = DatatypeConverter.printBase64Binary(encrypted);

    System.out.println("Encrypted info: " + text);

    bytebuf = ByteBuffer.allocate(1024);
    bytebuf.clear();

    // send the IV first
    bytebuf.put(iv);
    bytebuf.flip();
    while (bytebuf.hasRemaining()) {
        nBytes += client.write(bytebuf);
        System.out.println("Iv sent!");
    }

    // then send the Base64-encoded ciphertext
    bytebuf.clear();
    bytebuf.put(text.getBytes());
    bytebuf.flip();
    while (bytebuf.hasRemaining()) {
        nBytes += client.write(bytebuf);
    }

Server Code:

            LOGGER.info("Confirming write");

        byte[] iv = buf.array();

        LOGGER.info("Data packet found as {}", iv);


        LOGGER.info("Confirming write");
        String data = new String(buf.array());

        LOGGER.info("Data packet found as {}", data);


        IvParameterSpec ivspec = new IvParameterSpec(iv);
        String key1 = "1234567812345678";
        byte[] key2 = key1.getBytes();
        SecretKeySpec secret = new SecretKeySpec(key2, "AES");
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");


        cipher.init(Cipher.DECRYPT_MODE, secret, ivspec);

        byte[] encrypted = DatatypeConverter.parseBase64Binary(data);
        byte[] decrypted = cipher.doFinal(encrypted);

        System.out.println("Decrypted Info: " + new String(decrypted, StandardCharsets.UTF_8));

I get the following exception:

java.security.InvalidAlgorithmParameterException: Wrong IV length: must be 16 bytes long

It seems that if I allocate 1024 bytes to the buffer, for example, only a 32-byte byte[] is sent to the server, but a 1024-byte byte[] is generated on the server:

Data packet found as [-55, 119, 34, -19, -33, -20, -67, -77, 54, -111, 14, 94, 73, 98, 34, -7, 0, 0, 0, 0, 0, 0,..................

Am I even on the right path?
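
Side note: the trailing zeros in that log are expected, because ByteBuffer.array() returns the entire 1024-byte backing array no matter how few bytes the read actually delivered. A minimal sketch of a hypothetical drainBuffer helper that copies out only the bytes a read placed in the buffer:

    import java.nio.ByteBuffer;

    // Hypothetical helper: after a channel read has filled part of the
    // buffer, flip it and copy out only the bytes actually received,
    // instead of grabbing the whole backing array with buf.array().
    static byte[] drainBuffer(ByteBuffer buf) {
        buf.flip();                                  // switch buffer from writing to reading
        byte[] received = new byte[buf.remaining()]; // number of bytes the read delivered
        buf.get(received);                           // copy just those bytes
        return received;
    }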


1 Answer

Answered by pelican_george:

Instead of allocating 1024 bytes, I had to allocate exactly 16 bytes for the initialization vector:

    ByteBuffer buf = ByteBuffer.allocate(16);

I had also forgotten to read the SocketChannel again for the message that follows the IV:

    buf = ByteBuffer.allocate(32);
    buf.clear();
    socket.read(buf);
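
Putting both fixes together, here is a minimal sketch of the full server-side sequence. It assumes a blocking SocketChannel named socket and the same hard-coded key as in the question; the single read for the ciphertext and the absence of exception handling are simplifications for illustration, not production code:

    import java.nio.ByteBuffer;
    import java.nio.channels.SocketChannel;
    import java.nio.charset.StandardCharsets;
    import javax.crypto.Cipher;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.SecretKeySpec;
    import javax.xml.bind.DatatypeConverter;

    // 1. Read exactly the 16-byte IV that the client sends first.
    ByteBuffer ivBuf = ByteBuffer.allocate(16);
    while (ivBuf.hasRemaining() && socket.read(ivBuf) > 0) {
        // keep reading until all 16 IV bytes have arrived
    }
    IvParameterSpec ivspec = new IvParameterSpec(ivBuf.array());

    // 2. Read the Base64-encoded ciphertext and copy out only the received bytes.
    ByteBuffer dataBuf = ByteBuffer.allocate(1024);
    socket.read(dataBuf);
    dataBuf.flip();
    byte[] dataBytes = new byte[dataBuf.remaining()];
    dataBuf.get(dataBytes);
    String data = new String(dataBytes, StandardCharsets.UTF_8);

    // 3. Decrypt with the same key and the transmitted IV.
    SecretKeySpec secret = new SecretKeySpec("1234567812345678".getBytes(StandardCharsets.UTF_8), "AES");
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.DECRYPT_MODE, secret, ivspec);

    byte[] encrypted = DatatypeConverter.parseBase64Binary(data);
    System.out.println("Decrypted Info: " + new String(cipher.doFinal(encrypted), StandardCharsets.UTF_8));

Note that TCP is a byte stream: the two write() calls on the client are not guaranteed to arrive as two separate reads, so the fixed 16-byte IV is what makes the framing work here. Prefixing the ciphertext with its length would make the protocol more robust.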