I have the following program:
```c
#include <stdio.h>

int main(void)
{
    int i = 2147483647;
    unsigned int j = 4294967295;

    printf("%d %d %d\n", i, i + 1, i + 2);
    printf("%u %u %u\n", j, j + 1, j + 2);

    return 0;
}
```
Why is `i+2` not equal to `-2147483646`? Why is `j+2` not equal to `2`? The results differ from what I expected. What is the execution process that produces them?
EDIT
The results I get are:
- `i = 2147483647`
- `i+1 = -2147483648`
- `i+2 = -2147483647`
- `j = 4294967295`
- `j+1 = 0`
- `j+2 = 1`
If you output the value of `j` in hexadecimal notation, for example as in the sketch below, you will get the output `0xffffffff`: every bit of the object is set.
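A minimal sketch of that check (the `%#x` conversion prints an `unsigned int` in hexadecimal with a `0x` prefix):

```c
#include <stdio.h>

int main(void)
{
    unsigned int j = 4294967295u;

    /* prints 0xffffffff on an implementation with a 32-bit unsigned int */
    printf("%#x\n", j);

    return 0;
}
```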
So adding `1` to `0xffffffff` you get `0x00000000`, and adding `1` again you get `0x00000001`.

From the C Standard (6.2.5 Types):

> A computation involving unsigned operands can never overflow, because a result that cannot be represented by the resulting unsigned integer type is reduced modulo the number that is one greater than the largest value that can be represented by the resulting type.
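To make the modular reduction concrete, here is a small sketch assuming a 32-bit `unsigned int`, so results are reduced modulo 2^32 = 4294967296:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int j = UINT_MAX;   /* 4294967295 when unsigned int is 32 bits */

    /* (4294967295 + 1) mod 4294967296 == 0
       (4294967295 + 2) mod 4294967296 == 1
       Unsigned arithmetic is defined to wrap, so this is not an overflow. */
    printf("%u %u\n", j + 1, j + 2);   /* prints: 0 1 */

    return 0;
}
```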
As for the signed integer variable `i`, in general the result is undefined because of the overflow. If the internal representation of integers is two's complement, then implementations can silently wrap around on overflow. In this case, for the signed integer you will see the following.
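For example, this sketch prints the bit patterns of `i`, `i + 1`, and `i + 2` in hexadecimal (the casts to `unsigned` are mine, because `%#x` expects an unsigned argument; computing `i + 1` and `i + 2` is itself the formally undefined overflow):

```c
#include <stdio.h>

int main(void)
{
    int i = 2147483647;

    /* i + 1 and i + 2 overflow: formally undefined behavior, but on a
       two's complement implementation the values silently wrap around */
    printf("%#x %#x %#x\n", (unsigned)i, (unsigned)(i + 1), (unsigned)(i + 2));

    return 0;
}
```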
On such an implementation the output is:
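```
0x7fffffff 0x80000000 0x80000001
```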
That is, the hexadecimal representation `0x80000000` of an object of type `int` yields the minimal value that can be stored in the object (only the sign bit is set), while the representation `0x80000001` yields the value that follows the minimal value, that is `-2147483647`.
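You can confirm the correspondence between these bit patterns and the decimal values with `INT_MIN` from `<limits.h>` (again assuming a 32-bit two's complement `int`):

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* INT_MIN is -2147483648; its two's complement bit pattern is 0x80000000 */
    printf("%d %#x\n", INT_MIN, (unsigned)INT_MIN);             /* -2147483648 0x80000000 */
    printf("%d %#x\n", INT_MIN + 1, (unsigned)(INT_MIN + 1));   /* -2147483647 0x80000001 */

    return 0;
}
```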