A struct with bit-fields, even when "packed", seems to base a bit-field's size (and alignment, too?) on the declared integer type. Could someone point to a C++ rule that defines this behavior? I tried a dozen compilers and architectures (thank you, Compiler Explorer!) and the result was consistent across all of them.
Here's the code to play with: https://godbolt.org/z/31zMcnboY
#include <cstdint>
#pragma pack(push, 1)
struct S1{ uint8_t v: 1; }; // sizeof == 1
struct S2{ uint16_t v: 1; }; // sizeof == 2
struct S3{ uint32_t v: 1; }; // sizeof == 4
struct S4{ unsigned v: 1; }; // sizeof == 4
#pragma pack(pop)
auto f(auto s){ return sizeof(s); }
int main(){
f(S1{});
f(S2{});
f(S3{});
f(S4{});
}
The resulting assembly clearly shows the sizes returned by f() as 1, 2, and 4 for S1, S2, and S3 respectively.
Nothing about #pragma pack(push, 1) is specified by the standard (other than #pragma being specified as a preprocessor directive with implementation-defined meaning). It is a language extension. As for bit-fields, the standard ([class.bit]) says that allocation of bit-fields within a class object is implementation-defined, that alignment of bit-fields is implementation-defined, and that bit-fields are packed into some addressable allocation unit.
It's essentially entirely implementation-defined or unspecified.
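If you need a guaranteed 1-byte footprint regardless of compiler, one option is to avoid bit-fields altogether and mask bits in a fixed-width integer yourself. A minimal sketch; the Flags name and set/get helpers are purely illustrative, not part of any standard API:

#include <cstdint>
// Portable alternative: the layout depends only on a single uint8_t member,
// not on implementation-defined bit-field allocation.
struct Flags {
    std::uint8_t bits = 0;
    void set(unsigned bit, bool on) {
        const std::uint8_t mask = static_cast<std::uint8_t>(1u << bit);
        bits = on ? static_cast<std::uint8_t>(bits | mask)
                  : static_cast<std::uint8_t>(bits & ~mask);
    }
    bool get(unsigned bit) const { return (bits >> bit) & 1u; }
};
// Holds on every mainstream ABI, though strictly the standard permits trailing padding.
static_assert(sizeof(Flags) == 1, "expected a 1-byte footprint");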