OpenGL texture upload of 16-bit data without normalization


I have two scenarios: one where the data is a floating-point array, and another where it is 16-bit short data. I'm uploading the texture data to the graphics card with:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0, GL_RGBA, GL_FLOAT, floatData);

and in the short case:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0, GL_RGBA, GL_SHORT, shortData);

The problem: say a value of 512 is stored. When I upload the floatData array, I get the expected value of 512 inside the shader. However, when I use shortData, the value 512 gets normalized to 512/65535.
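
(As far as I can tell, this is the standard fixed-point conversion: an unsigned 16-bit value c is mapped to c / 65535 = c / (2^16 - 1) on upload, so 512 comes back as roughly 0.0078 in the shader.)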

So my question: is there a way to instruct OpenGL to take my shortData array and not normalize it?
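
I suspect the answer involves an integer internal format together with GL_RGBA_INTEGER, something like the sketch below, but I'm not sure it's the right approach (the sized format and the usampler2D/texelFetch usage on the shader side are my assumptions):

// Sketch (assumption): upload raw 16-bit values into an unsigned-integer
// texture so that no normalization is applied. Requires OpenGL 3.0+.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16UI, width, height, 0,
             GL_RGBA_INTEGER, GL_UNSIGNED_SHORT, shortData);

// Integer textures cannot be linearly filtered, so nearest filtering is needed:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

// On the shader side the sampler would then be declared as an integer sampler:
//   uniform usampler2D tex;
//   uvec4 texel = texture(tex, uv);   // texel.r == 512, not 512/65535
// (For signed data the equivalents would be GL_RGBA16I, GL_SHORT, and isampler2D.)

Alternatively, if normalized sampling were acceptable as long as the full 16-bit precision survives, GL_RGBA16 with GL_UNSIGNED_SHORT and multiplying by 65535.0 in the shader might also work, but I'd prefer to get the raw integers directly.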
