Why doesn't gcc 13 display the correct binary representation?


While answering a question here, I made the following example:

#include <stdio.h>
#include <math.h>

int main (void) 
{
  float_t a = -248.75;
  printf("%f\n", a);

  /* Dump the object representation of a, byte by byte. Accessing any
     object through an unsigned char pointer is always permitted. */
  unsigned char* ptr = (unsigned char*)&a;
  for(size_t i=0; i<sizeof(a); i++)
  {
    printf("%.2X ", ptr[i]);
  }
}

With gcc versions before 13, as well as with all versions of clang, this gives the expected output (x86, little-endian):

-248.750000
00 C0 78 C3 

However, when compiling with gcc 13.1, I get nonsense output:

-248.750000
00 00 00 00 

Further examination shows that the culprit is the option -std=c2x. If I remove it, the program behaves as expected.

Compiler options used: -std=c2x -pedantic-errors -Wall -Wextra -O3 -lm. https://godbolt.org/z/4qbo74eEW
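
For comparison, here is a variant that copies the bytes out with memcpy into a plain array before printing them. This is only a sketch; I have not checked whether the affected gcc 13.1 build miscompiles it as well.

#include <stdio.h>
#include <string.h>
#include <math.h>

int main (void)
{
  float_t a = -248.75;
  printf("%f\n", a);

  /* memcpy is specified to copy the object representation of a,
     so the loop below only ever reads a plain array. */
  unsigned char bytes[sizeof(a)];
  memcpy(bytes, &a, sizeof(a));

  for(size_t i=0; i<sizeof(bytes); i++)
  {
    printf("%.2X ", bytes[i]);
  }
}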

Is this a known bug in gcc?

1 Answer

Accepted answer (by emacs drives me nuts):

This is now filed as a GCC bug, cf. https://gcc.gnu.org/PR111884

gcc 11 and 12 seem to work for me.

Edit: this bug is now fixed in gcc 13.3 and later.
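
If you are stuck on an affected release in the meantime, type punning through a union is one possible way around it. This is a sketch only: I have not tested it against the buggy builds, so whether it dodges the miscompilation is an assumption.

#include <stdio.h>
#include <math.h>

int main (void)
{
  /* Reading u.bytes after writing u.value is well-defined in C;
     the bytes member overlays the object representation of value. */
  union {
    float_t value;
    unsigned char bytes[sizeof(float_t)];
  } u = { .value = -248.75 };

  printf("%f\n", u.value);

  for(size_t i=0; i<sizeof(u.bytes); i++)
  {
    printf("%.2X ", u.bytes[i]);
  }
}

No pointer to the float object ever enters the loop here, so the byte reads cannot be affected by alias analysis on the pointer access.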