OpenGL textures and endianness


I'm using glTexSubImage2D with GL_LUMINANCE and GL_UNSIGNED_BYTE to display raw greyscale data from a camera directly - rather than having to repack it into an RGB bitmap manually.

I would like to run the camera in a higher resolution mode, at 12 or 14 bits per pixel.

I can do this simply by setting GL_SHORT, but the camera returns data in big-endian order and my OpenGL implementation seems to be drawing it the wrong way around (on x86).

Is there a simple way of telling OpenGL that the textures are the 'wrong' way round? I would like to avoid manually byte-swapping the data just for display, because all the other functions expect big-endian data.


1 Answer

Answered by luke (accepted):

Check out the glPixelStore* group of functions.

You might need to play with GL_UNPACK_SWAP_BYTES or GL_UNPACK_LSB_FIRST but double check you're using the correct GL_UNPACK_ALIGNMENT. By default the unpack alignment is 4, but if you're using one byte per pixel (lum / ub), you'll want to set that to 1. I ran into this problem just recently, and it took me longer to figure out than I'd care to admit :)