I have been writing a 3D software renderer in C. So far I had only rendered wireframes, but now I am trying to rasterize filled triangles with depth buffering, using barycentric coordinates.
However, for some reason nothing gets drawn to almost half of the screen. After some testing I found that the problem is caused by the way I clamp each triangle's bounding box to the screen before rasterizing it. I also have a second problem with the depth buffering: pixels are still not drawn in the correct order, despite the depth buffer.
Here is the code:
#include <math.h>
#include <stdbool.h>

// Solves P = A + w1*(B - A) + w2*(C - A) for the weights w1 and w2,
// then reports whether P lies inside triangle ABC.
bool PointInTriangle(Vector2 A, Vector2 B, Vector2 C, Vector2 P)
{
    float s1 = C.y - A.y;
    float s2 = C.x - A.x;
    float s3 = B.y - A.y;
    float s4 = P.y - A.y;
    float w1 = (A.x * s1 + s4 * s2 - P.x * s1) / (s3 * s2 - (B.x - A.x) * s1);
    float w2 = (s4 - w1 * s3) / s1;
    // Inside when both weights are non-negative and they sum to at most 1.
    return w1 >= 0 && w2 >= 0 && (w1 + w2) <= 1;
}
void InterpolateTriangle(struct Triangle triangle, const int Screen_Width, const int Screen_Height,
                         unsigned long *depthBuffer, int camZ)
{
    // Axis-aligned bounding box of the projected triangle.
    int minY = fmin(triangle.v2.projPosition.y, fmin(triangle.v0.projPosition.y, triangle.v1.projPosition.y));
    int maxY = fmax(triangle.v2.projPosition.y, fmax(triangle.v0.projPosition.y, triangle.v1.projPosition.y));
    int minX = fmin(triangle.v2.projPosition.x, fmin(triangle.v0.projPosition.x, triangle.v1.projPosition.x));
    int maxX = fmax(triangle.v2.projPosition.x, fmax(triangle.v0.projPosition.x, triangle.v1.projPosition.x));

    // Clamp the bounding box to the screen -- the part that seems to cause the issue.
    if (maxY < 0) return;
    if (minY < 0) minY = 0;
    if (minY > Screen_Height) return;
    if (maxY > Screen_Height) maxY = Screen_Height;
    if (maxX < 0) return;
    if (minX < 0) minX = 0;
    if (minX > Screen_Width) return;
    if (maxX > Screen_Width) maxX = Screen_Width;

    for (int y = minY; y < maxY; y++) {
        for (int x = minX; x < maxX; x++) {
            if (PointInTriangle(triangle.v0.projPosition, triangle.v1.projPosition,
                                triangle.v2.projPosition, (Vector2){ (float)x, (float)y })) {
                // Depth interpolation weights, one per edge, computed from x only.
                double lambda0 = ((double)x - triangle.v0.projPosition.x) / (triangle.v1.projPosition.x - triangle.v0.projPosition.x);
                double lambda1 = ((double)x - triangle.v1.projPosition.x) / (triangle.v2.projPosition.x - triangle.v1.projPosition.x);
                double lambda2 = ((double)x - triangle.v2.projPosition.x) / (triangle.v0.projPosition.x - triangle.v2.projPosition.x);
                // Interpolate 1/z across the triangle, then invert to get z.
                double z = 1.0 / triangle.v0.position.z * lambda0
                         + 1.0 / triangle.v1.position.z * lambda1
                         + 1.0 / triangle.v2.position.z * lambda2;
                z = 1.0 / z;
                z += (double)camZ;
                // Depth test: only draw the pixel if it is closer than what is stored.
                int currentIndex = Screen_Width * y + x;
                if (depthBuffer[currentIndex] > (unsigned long)z) {
                    depthBuffer[currentIndex] = (unsigned long)z;
                    DrawPixel(x, y, triangle.color);
                }
            }
        }
    }
}
For the depth buffering I use the equation shown on this page: https://www.scratchapixel.com/lessons/3d-basic-rendering/rasterization-practical-implementation/visibility-problem-depth-buffer-depth-interpolation.html
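If I understand that page correctly, given the barycentric coordinates λ0, λ1, λ2 of the pixel, it interpolates the reciprocal of the vertex depths:

    1/z = λ0 * (1/z0) + λ1 * (1/z1) + λ2 * (1/z2)

and then inverts the result, which is what the lambda0..lambda2 part of my code is trying to do.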
And here is an image of the problem:
To fix the problem I have tried not clamping the bounding box at all, but that crashes the program, even when I comment out the depth buffering so the depth buffer cannot be overflowed. I have also tried removing only the maxX clamp, which fixes the half-screen problem, but it is way slower, and the zFar plane also ends up much closer when I do that, and I can't figure out why.
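For reference, this is how I understand true barycentric weights are normally computed with edge functions (just an untested sketch for comparison with my lambdas, using the same Vector2 type as above; EdgeFunction and Barycentric are names I made up):

    // Twice the signed area of triangle (a, b, p).
    static float EdgeFunction(Vector2 a, Vector2 b, Vector2 p)
    {
        return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
    }

    // Weights of point P in triangle (A, B, C); they sum to 1 inside the triangle.
    static void Barycentric(Vector2 A, Vector2 B, Vector2 C, Vector2 P,
                            double *l0, double *l1, double *l2)
    {
        float area = EdgeFunction(A, B, C); // twice the full triangle's signed area
        *l0 = EdgeFunction(B, C, P) / area; // weight of vertex A
        *l1 = EdgeFunction(C, A, P) / area; // weight of vertex B
        *l2 = EdgeFunction(A, B, P) / area; // weight of vertex C
    }

Is the difference between this and my per-edge x-only lambdas part of what is going wrong?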