int v_1[10] = { -1, 1000, 2 };
int a;
a = strlen((char *)v_1 + 1);
I'm having trouble understanding why the 'a' variable is equal to 5 in this particular case.
I'm also struggling to grasp the behavior of the cast to char*. I don't understand if there is some calculation involved in the cast that I'm missing, because I cannot get 5 in any way.
I attempted to modify the values in the array, but I can't discern what leads to the resulting outcome.
I am working on a Windows platform.
The value of a depends a lot on what system you run this on. Presumably in this case, you're using a system where sizeof(int) is equal to 4 and integers are stored in little-endian order. A simple program can show you the contents of your data in hex:
Output (on a system with 4-byte little-endian integers):

ff ff ff ff e8 03 00 00 02 00 00 00 (followed by 28 more 00 bytes for the zero-initialized remainder of the array)
Now, your question is about why strlen((char*)v_1 + 1) is equal to 5. To answer that, look at the output again: starting from the second byte, strlen is going to step over five bytes (ff ff ff e8 03) before reaching a NUL byte, which is what delimits the end of a string. And so that is why, on your system, you get the value 5.