char bytes[2];
bytes[0] = 0; //0x00
bytes[1] = 24; //0x18
uint16_t* ptrU16 = (uint16_t*)bytes; // I expect it points to the memory block: 0x18
cout << *ptrU16 << endl; // I expect 24, but it is 6144
What's wrong with my code?
You may want to look into the ntohs() function ("network to host byte order conversion"). You've stored your data in big-endian order, which is traditionally also network byte order. No matter what host you're on, ntohs() should return the value you're expecting. There's a mirror function, htons(), for going from host to network order.
#include <arpa/inet.h>
...
cout << ntohs(*ptrU16) << endl;
should work and be portable across systems (i.e. it should work on Power, ARM, x86, Alpha, etc.).
You have a little-endian machine. 6144 is 0x1800. When your machine represents the 16-bit value 0x0018 in memory, it puts the 0x18 byte first and the 0x00 byte second, so when you interpret the two-byte sequence 0x00 0x18 as a uint16_t, it gives you 6144 (i.e. 0x1800), and not 24 (i.e. 0x0018).

If you change to:

bytes[0] = 24; //0x18
bytes[1] = 0;  //0x00

you'll likely see the result you expect.
If you want to keep the bytes in big-endian order and still get 24, you'll either have to compute the value manually, such as:
or, more generally:
or you can use a function like ntohs(), since network byte order is big-endian.