The upcoming C23 Standard adds a keyword `_BitInt(N)`, which, as I understand it, can be used to define an integer type with a specific number of bits. However, I could not find much information regarding the in-memory representation of types declared this way, or about any behavior that depends on that representation, such as their size or alignment.

As such, is there any difference in behavior, representation, or alignment requirements between `_BitInt(N)` types and "real" integer types of the same bit width? For example, between `_BitInt(32)` and `int32_t` or `int_least32_t`? And is it well-defined to type-pun between them?
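To make the question concrete, here is a minimal sketch of the kind of comparison and type-punning I have in mind (assuming a compiler with `_BitInt` support, e.g. a recent Clang); whether the `memcpy` and union reads below are guaranteed to work is exactly what I am asking:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Do the two types agree on size and alignment on this implementation? */
    printf("sizeof:  _BitInt(32)=%zu  int32_t=%zu\n",
           sizeof(_BitInt(32)), sizeof(int32_t));
    printf("alignof: _BitInt(32)=%zu  int32_t=%zu\n",
           _Alignof(_BitInt(32)), _Alignof(int32_t));

    /* memcpy-based pun: copy the object representation of an int32_t
       into a _BitInt(32). This assumes the sizes match, which is itself
       part of the question. */
    int32_t     a = -42;
    _BitInt(32) b;
    memcpy(&b, &a, sizeof b);
    printf("memcpy pun: %d\n", (int)b);

    /* Union-based pun: write one member, read the other. */
    union { int32_t i; _BitInt(32) w; } u = { .i = -42 };
    printf("union pun:  %d\n", (int)u.w);

    return 0;
}
```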