I am trying to read the FPU context in a SIGFPE handler.
With GDB, I set a breakpoint in the SIGFPE handler and inspected the ucontext for the FPU data; I got the following:
(gdb) p/x *ucontext->uc_mcontext.fpregs
$2 = {cwd = 0x37f, swd = 0x0, ftw = 0x0, fop = 0x0, rip = 0x0, rdp = 0x0, mxcsr = 0x1f80, mxcr_mask = 0xffff
....
And,
(gdb) p/x ucontext->__fpregs_mem
$3 = {cwd = 0xe670, swd = 0xffff, ftw = 0x7fff, fop = 0x0, rip = 0x0, rdp = 0x0, mxcsr = 0x37f, mxcr_mask = 0x0, _st = {{significand = {0x0,
0x0, 0x0, 0x0}, exponent = 0x0, padding = {0x0, 0x0, 0x0}}, {significand = {0x1f80, 0x0, 0xffff, 0x0},
To my surprise, the two FP register sets contain different data, and it looks as though uc_mcontext.fpregs holds data starting at an offset into __fpregs_mem.
Why is there a difference? Shouldn't these two fields contain the same data?
I am using Ubuntu 16.04 x86_64, gcc 5.4.0 20160609, glibc 2.23-0ubuntu10.
It is technically incorrect to call getcontext in a signal handler (read this), though I somewhat doubt that is the cause of your problems. If your goal is to determine which floating-point error occurred, you can use this.
It would be easier to help if you posted some code snippets.