How do I convert a char array to a uint16_t by casting a pointer?

char bytes[2];
bytes[0] = 0; //0x00
bytes[1] = 24; //0x18
uint16_t* ptrU16 = (uint16_t*)bytes; // I expect this to point at the byte 0x18
cout << *ptrU16 << endl;  // I expect 24, but it prints 6144

What's wrong with my code?


There are 2 answers

Crowman (Best Answer)

You have a little-endian machine. 6144 is 0x1800. When your machine stores the 16-bit value 0x0018 in memory, it puts the low byte 0x18 first and the high byte 0x00 second. Your array holds the bytes 0x00, 0x18 in that order, so a little-endian machine reading those two bytes back as a uint16_t treats 0x00 as the low byte and 0x18 as the high byte, giving you 6144 (i.e. 0x1800) and not 24 (i.e. 0x0018).
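As an illustration (a minimal sketch, not part of the original answer), you can print the bytes of a uint16_t in memory order; inspecting an object's bytes through an unsigned char* is always well-defined:

#include <cstdint>
#include <cstdio>

int main() {
    uint16_t value = 0x0018;  // the 16-bit value 24
    // Reading any object's bytes through unsigned char* is legal.
    const unsigned char* p = reinterpret_cast<const unsigned char*>(&value);
    // A little-endian machine prints "18 00"; a big-endian one prints "00 18".
    std::printf("%02x %02x\n", p[0], p[1]);
}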

If you change to:

bytes[0] = 24; 
bytes[1] = 0;

you'll likely see the result you expect.

If you really want to get the result you expect, regardless of your machine's endianness, then you'll have to compute it manually. With the reordered array above (low byte first), that looks like:

// the casts prevent sign extension when char is a signed type
uint16_t n = (static_cast<unsigned char>(bytes[1]) << 8) + static_cast<unsigned char>(bytes[0]);

or, more generally:

char bytes[] = {0x18, 0x00};  // low byte first
uint16_t n = 0;
for ( size_t i = 0; i < 2; ++i ) {
    n += static_cast<unsigned char>(bytes[i]) << (8 * i);  // again, avoid sign extension
}
std::cout << n << std::endl;

or you can use a function like ntohs(), since network byte order is big-endian, which matches the order of your original array.
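A caveat neither answer raises: casting a char* to uint16_t* and dereferencing it technically violates C++'s strict-aliasing rule, and the read may also be misaligned. A minimal sketch of a safer way to reinterpret the bytes, combined with ntohs() from the POSIX <arpa/inet.h> header, would be:

#include <arpa/inet.h>
#include <cstdint>
#include <cstring>
#include <iostream>

int main() {
    char bytes[2] = {0x00, 0x18};          // big-endian (network) order for 24
    uint16_t raw;
    std::memcpy(&raw, bytes, sizeof raw);  // well-defined, unlike the pointer cast
    std::cout << ntohs(raw) << std::endl;  // prints 24 on any host
}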

Charlie

You may want to look into the ntohs() function ("network to host" byte-order conversion). You've stored your data in big-endian order, which is traditionally also network byte order, so no matter what host you're on, ntohs() should return the value you're expecting. There's a mirror function, htons(), for going from host to network order.

#include <arpa/inet.h>
...
cout << ntohs(*ptrU16) << endl;

should work and be portable across systems (i.e., it should work on POWER, ARM, x86, Alpha, etc.).