OpenGL GLSL Send color as integer to shader to be decomposed as vec4 RGBA

I can send color to the shader as 4 floats - no problem. However, I want to send it as an integer (or unsigned integer, it doesn't really matter, what matters is 32 bits) and have it decomposed into a vec4 in the shader.

I'm using OpenTK as a C# wrapper for OpenGL (although it should be pretty much a direct wrapper).
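
For concreteness, the packing I mean could look like this on the C# side (just a sketch; PackRgba is an illustrative helper, assuming red goes into the lowest byte):

// Sketch: pack four color bytes into one 32-bit value.
// With red in the lowest byte, the bytes sit in memory as R, G, B, A on little-endian,
// which read as an int is 0xAABBGGRR.
static int PackRgba(byte r, byte g, byte b, byte a)
{
    return unchecked(r | (g << 8) | (b << 16) | (a << 24));
}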

Let's consider one of the simplest shaders, with a vertex containing position (xyz) and color (rgba).

Vertex shader:

#version 150 core

in vec3 in_position;
in vec4 in_color;
out vec4 pass_color;
uniform mat4 u_WorldViewProj;

void main()
{
    gl_Position = vec4(in_position, 1.0f) * u_WorldViewProj;
    pass_color = in_color;
}

Fragment shader:

#version 150 core

in vec4 pass_color;
out vec4 out_color;

void main()
{
    out_color = pass_color;
}
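
(Side note on where attributeIndex comes from: since these shaders don't use explicit layout locations, the index has to match the attribute location of the linked program. A minimal sketch, assuming program is an already created program object with both shaders attached:)

// Bind fixed locations before linking...
GL.BindAttribLocation(program, 0, "in_position");
GL.BindAttribLocation(program, 1, "in_color");
GL.LinkProgram(program);
// ...or query them after linking:
int colorAttributeIndex = GL.GetAttribLocation(program, "in_color");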

Let's create the vertex buffer:

public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);
    GL.VertexAttribIPointer(attributeIndex, 4, VertexAttribIntegerType.UnsignedByte, 0, rawData);
    GL.EnableVertexAttribArray(attributeIndex);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}

And I'm getting all zeros for the vec4 'in_color' in the vertex shader. Not sure what's wrong.

The closest thing I found: https://www.opengl.org/discussion_boards/showthread.php/198690-I-cannot-send-RGBA-color-as-unsigned-int

Also, in VertexAttribIPointer I'm passing 0 as the stride, because I have a vertex array object and keep the data separated per attribute, so the colors come tightly packed, 32 bits (one color) per vertex.
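
For context, the buffers are meant to be set up inside a VAO roughly like this (a sketch; CreateVertexBufferPosition is a hypothetical counterpart of the method above that uploads float xyz data):

// Sketch of the overall setup: one VAO, one buffer per attribute.
int vao = GL.GenVertexArray();
GL.BindVertexArray(vao);

// positions: float[] with xyz per vertex, colors: int[] with one packed color per vertex
int positionBuffer = CreateVertexBufferPosition(0, positions);
int colorBuffer = CreateVertexBufferColor(1, colors);

GL.BindVertexArray(0);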

There are 2 answers

chainerlt (BEST ANSWER)

Aight, so this did the job for me:

Having the managed data as an int[] that is a tightly packed array of only colors (int format RGBA meaning 0xAABBGGRR, i.e. on little-endian the bytes sit in memory as R, G, B, A), then defining the vertex attribute as GL.VertexAttribPointer(index, 4, VertexAttribPointerType.UnsignedByte, true, sizeof(int), IntPtr.Zero), and using it in the shader as in vec4 in_color;.
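
Put back into the helper from the question, that comes out roughly like this (a sketch of the adjusted method):

public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);

    // 4 unsigned bytes per vertex, normalized to [0,1], stride of one packed int,
    // read from the currently bound buffer starting at offset 0 (IntPtr.Zero).
    GL.VertexAttribPointer(attributeIndex, 4, VertexAttribPointerType.UnsignedByte, true, sizeof(int), IntPtr.Zero);
    GL.EnableVertexAttribArray(attributeIndex);

    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}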

Rhu Mage

You have to use VertexAttribPointer and not VertexAttribIPointer when the inputs in your shader are floats (vec4).

Set the normalized parameter to GL_TRUE.

Spec says:

For glVertexAttribPointer, if normalized is set to GL_TRUE, it indicates that values stored in an integer format are to be mapped to the range [-1,1] (for signed values) or [0,1] (for unsigned values) when they are accessed and converted to floating point. Otherwise, values will be converted to floats directly without normalization.
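
In other words, with UnsignedByte data and normalized set to GL_TRUE, each color byte arrives in the vec4 divided by 255; roughly:

// What normalization amounts to for an unsigned byte color channel:
byte raw = 0xCC;                 // packed color byte
float inShader = raw / 255.0f;   // 0.8, the value the vec4 component ends up with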