I'm using lldb from Terminal to debug a C++ program. It works for most things, except that it won't print certain characters.
At some point my program receives a byte array from a remote system (a vector<unsigned char>), and when I print that value I get the following:
[10] = '\0'
[11] = '\0'
[12] = 'O'
[13] = '\b'
[14] = '\0'
[15] = '\x01'
[16] = '\0'
[17] = '\0'
[18] = '\0'
[19] = '\x17'
[20] = '\0'
[21] = '\x04'
[22] = '?'
[23] = '?'
[24] = '?'
[25] = '?'
[26] = '\0'
[27] = '\0'
[28] = 'n'
[29] = 'a'
[30] = 'm'
[31] = 'e'
[32] = '\0'
[33] = '\0'
[34] = '\0'
Using Xcode I can see that the four ? marks are actually four \xff bytes. Is there a way to make lldb print these characters correctly without Xcode?
If you want to see the hex values of the elements of a character array, you can use lldb's emulation of the gdb formatting syntax:
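(lldb) p/x buffer

(where buffer stands in for whatever your vector variable is actually called)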
To see all the formatting options available do:
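(lldb) help format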
BTW, I'm not sure it's right to say this is "printing them correctly...". If I have a character array with some 0xff values in it, printf prints them as '?': that's the character representation of that value. By default lldb prints char variables by interpreting them as printf would. You can change this default to hex if that suits you better using lldb's data formatters. See http://lldb.llvm.org/varformats.html for more details.
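For instance, to make lldb show unsigned char values as hex by default (a minimal sketch; the varformats page above covers the full syntax):

(lldb) type format add --format hex "unsigned char"

After that, the elements of your vector should print as values like 0xff rather than ?.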