I know modern OSs such as Linux don't always execute an application at the same address it was originally linked. When a debugger starts looking around though, it needs to know the relationship between the original link address and the final executing address. How does GDB calculate the offset?
Clarifications: I'm not talking about virtual memory. That is, I have (what I believe to be) a reasonable understanding of how virtual memory works and am operating entirely within that address space. I have symbols that are at one location when I dump the symbol table from the ELF, but at another location when I read their address out of memory.
In this particular case, I have a string which in the linked executable is at address 0x0E984141. In a dump of memory from that process, it is at address 0x0E3F2781. Everything in the `.rodata` section, at least, has been shifted by 0x5919C0. It appears to be something like Address Space Layout Randomization.
Correct. This is only possible for position-independent executables (linked with the `-pie` flag).
The same way GDB calculates the offset for shared libraries (a PIE executable is really a special case of a shared library). There is a defined interface between `ld.so` and GDB, consisting of the `_dl_debug_state()` function (on which GDB sets an internal breakpoint, and which `ld.so` calls whenever it maps a new ELF image into the process) and `struct r_debug`. The latter points to a linked list of `struct link_map`s, whose `l_addr` member is the offset between the linked-at and loaded-at addresses.