I am currently working on my first OpenGL-based game engine. I need normal mapping as a feature, but it isn't working correctly.
Here is an animation of what is happening:
The artifacts are affected by the angle between the light and the normals on the surface. Camera movement does not affect them in any way. I am also (at least for now) taking the less efficient route where the normal extracted from the normal map is converted into view space, rather than converting everything to tangent space.
Here are the relevant pieces of my code:
Generating Tangents and Bitangents
for(int k = 0; k < (int)mb->getIndexCount(); k += 3)
{
    // Indices and attributes of the current triangle
    unsigned int i1 = mb->getIndex(k);
    unsigned int i2 = mb->getIndex(k + 1);
    unsigned int i3 = mb->getIndex(k + 2);

    JGE_v3f v0 = mb->getVertexPosition(i1);
    JGE_v3f v1 = mb->getVertexPosition(i2);
    JGE_v3f v2 = mb->getVertexPosition(i3);

    JGE_v2f uv0 = mb->getVertexUV(i1);
    JGE_v2f uv1 = mb->getVertexUV(i2);
    JGE_v2f uv2 = mb->getVertexUV(i3);

    // Edge vectors in position space and UV space
    JGE_v3f deltaPos1 = v1 - v0;
    JGE_v3f deltaPos2 = v2 - v0;
    JGE_v2f deltaUV1 = uv1 - uv0;
    JGE_v2f deltaUV2 = uv2 - uv0;

    // Determinant of the UV delta matrix; skip triangles with degenerate UVs
    float ur = deltaUV1.x * deltaUV2.y - deltaUV1.y * deltaUV2.x;
    if(ur != 0)
    {
        float r = 1.0f / ur;

        JGE_v3f tangent;
        JGE_v3f bitangent;
        tangent = ((deltaPos1 * deltaUV2.y) - (deltaPos2 * deltaUV1.y)) * r;
        tangent.normalize();
        bitangent = ((deltaPos1 * -deltaUV2.x) + (deltaPos2 * deltaUV1.x)) * r;
        bitangent.normalize();

        // Accumulate the face tangent/bitangent into each vertex of the
        // triangle; the sums are re-normalized before use.
        tans[i1] += tangent;
        tans[i2] += tangent;
        tans[i3] += tangent;
        btans[i1] += bitangent;
        btans[i2] += bitangent;
        btans[i3] += bitangent;
    }
}
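Since the loop only accumulates the per-triangle results into tans and btans, the per-vertex sums end up unnormalized; my vertex shader's normalize() takes care of that, but it could also be done in a CPU post-pass, roughly like this (getVertexCount(), setVertexTangent() and setVertexBitangent() are placeholder names here, not methods shown above):

// Hypothetical post-pass: normalize the accumulated sums and hand the
// per-vertex tangent frame back to the mesh (assumed setters).
for(unsigned int v = 0; v < mb->getVertexCount(); v++)
{
    tans[v].normalize();
    btans[v].normalize();
    mb->setVertexTangent(v, tans[v]);
    mb->setVertexBitangent(v, btans[v]);
}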
Calculating the TBN matrix in the Vertex Shader (mNormal corrects the normal for non-uniform scales)
vec3 T = normalize((mVW * vec4(tangent, 0.0)).xyz);
tnormal = normalize((mNormal * n).xyz);
vec3 B = normalize((mVW * vec4(bitangent, 0.0)).xyz);
tmTBN = transpose(mat3(
    T.x, B.x, tnormal.x,
    T.y, B.y, tnormal.y,
    T.z, B.z, tnormal.z));
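As an aside, GLSL's mat3 constructor fills the matrix column by column, so the transpose of that row-wise layout is equivalent to building the matrix directly from the basis vectors as columns. The same TBN matrix could be written more compactly as:

// T, B and tnormal become the matrix columns, i.e. the tangent-to-view
// transform, so no transpose is needed.
tmTBN = mat3(T, B, tnormal);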
Finally, here is where I use the sampled normal from the normal map and attempt to convert it to view space in the Fragment Shader:
fnormal = normalize(nmapcolor.xyz * 2.0 - 1.0);
fnormal = normalize(tmTBN * fnormal);
"nmapcolor" is the sampled color from the normal map. "fnormal" is then used like normal in the lighting calculations.
I have been trying to solve this for so long and have absolutely no idea how to get this working. Any help would be greatly appreciated.
EDIT - I slightly modified the code to work in world space and output the results. The big platform does not have normal mapping (and it works correctly), while the smaller platform does.
I also visualized which direction the normals are facing. They should both be generally the same color, but they're clearly different. It seems the tmTBN matrix isn't transforming the tangent-space normal into world (normally view) space properly.
Well... I solved the problem. It turns out my normal mapping implementation was perfect. The problem was actually in my texture class. This is, of course, my first time writing an OpenGL rendering engine, and I did not realize that the unlock() function in my texture class uploaded ALL of my textures as GL_SRGB_ALPHA, including normal maps. Only diffuse map textures should be GL_SRGB_ALPHA. Temporarily forcing all textures to load as GL_RGBA fixed the problem. I can't believe I had this problem for 11 months, only to find it was something so small.
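For anyone who runs into the same thing, the distinction looks roughly like this; uploadTexture and the isSRGB flag are placeholders rather than my actual texture class, and I'm assuming GLEW for the GL headers:

#include <GL/glew.h>

// Diffuse/albedo textures are authored in sRGB, so they get an sRGB
// internal format and the GPU linearizes them when sampling. Normal
// maps hold linear data and must stay plain GL_RGBA, otherwise the
// decoded normals come out skewed.
void uploadTexture(GLuint tex, int w, int h, const unsigned char* pixels, bool isSRGB)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0,
                 isSRGB ? GL_SRGB_ALPHA : GL_RGBA,   // internal format
                 w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels); // source pixel layout
    glGenerateMipmap(GL_TEXTURE_2D);
}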