Why does a pixel set with SetPixel return a different value from GetPixel?


Here is my scenario:

I'm using Color.FromArgb in C# to get a color from an int up to 800 (which can be reversed with ToArgb).

I'm setting a certain pixel (e.g. 0,0) in a bitmap to that color and saving the bitmap as a JPEG. When I use GetPixel to read the pixel and convert it back to an int, I get a negative value, which has puzzled me. Any thoughts or suggestions?
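
Roughly what I'm doing looks like this (a minimal sketch of my setup; the file name and the starting value are just examples):

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;

class Repro
{
    static void Main()
    {
        int original = 800;                          // the int I want to round-trip
        Color colorIn = Color.FromArgb(original);    // int -> Color

        using (var bmp = new Bitmap(16, 16))
        {
            bmp.SetPixel(0, 0, colorIn);             // write the colour into pixel (0,0)
            bmp.Save("test.jpg", ImageFormat.Jpeg);  // save as JPEG
        }

        using (var loaded = new Bitmap("test.jpg"))
        {
            int roundTripped = loaded.GetPixel(0, 0).ToArgb();  // Color -> int
            Console.WriteLine(roundTripped);         // negative, and not 800
        }
    }
}
```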


1 Answer

chillitom (best answer):

I suspect two things are at play here.

Firstly, JPEG is a lossy format, so the number you put in might not be the exact number you get out; a pure white, for example, is likely to come back as a very light gray.

Secondly, why do you get a negative value when you start with a positive number? This is all down to the way int, uint, and colours are represented in binary.

In RGB notation white has the hex value #ffffff; in ARGB it is #ffffffff.

0xffffffff in hex is 4,294,967,295 in decimal.

However, an int is a signed type, which means it is stored in two's complement representation.

If the highest bit in the number is set, that bit contributes -2,147,483,648 (hexadecimal 0x80000000) rather than +2,147,483,648.

So 0xffffffff becomes 2,147,483,647 (the value of the lower 31 bits) - 2,147,483,648 = -1.
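
You can see the same reinterpretation directly in C# (a quick sketch, not code from the question; the unchecked is needed because the hex literal is a uint):

```csharp
using System;
using System.Drawing;

class TwosComplementDemo
{
    static void Main()
    {
        uint unsignedWhite = 0xffffffff;                  // opaque white as an unsigned 32-bit value
        int signedWhite = unchecked((int)unsignedWhite);  // reinterpret the same bits as a signed int

        Console.WriteLine(unsignedWhite);                 // 4294967295
        Console.WriteLine(signedWhite);                   // -1
        Console.WriteLine(Color.FromArgb(signedWhite));   // Color [A=255, R=255, G=255, B=255]
    }
}
```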

If your white came back as a slightly off-white gray such as #ffeeeeee, its decimal value in two's complement notation would be -1,118,482.
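
If you want to see the bits behind such a negative int, formatting it as hex or casting it back to uint makes the relationship obvious (again, just an illustrative sketch):

```csharp
using System;

class GrayDemo
{
    static void Main()
    {
        int gray = unchecked((int)0xffeeeeee);     // the slightly off-white gray from above

        Console.WriteLine(gray);                   // -1118482
        Console.WriteLine(gray.ToString("X8"));    // FFEEEEEE
        Console.WriteLine(unchecked((uint)gray));  // 4293848814
    }
}
```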