I came across an issue with occlusion queries. In my case, the query result is always larger than it should be: e.g. I render at 1024*1024 resolution, but the query result for an object in the scene is 2085029 (> 1024*1024).
The query method used is from GPU Gems, Chapter 29:
// Disable color and depth writes so the pass only feeds the query
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);

// occlusionQuery[0] was created earlier with glGenQueries
glBeginQuery(GL_SAMPLES_PASSED, occlusionQuery[0]);
mesh->Render();
glEndQuery(GL_SAMPLES_PASSED);

glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);

// Read back the sample count (this call may stall until the result is ready)
glGetQueryObjectuiv(occlusionQuery[0], GL_QUERY_RESULT, &screenFragmentCount[0]);
Can anyone help?
The occlusion query does not tell you how many pixels passed the tests. It tells you how many fragments passed the tests (or more accurately, "samples").
If you draw one triangle, and then draw another triangle in front of it, the occlusion query counts fragments from both, even though the first triangle is occluded by the second. Triangles are rasterized in order, so the later triangle that covers the earlier one has not been rasterized yet by the time the earlier one's fragments are counted. Occlusion like this is only accounted for (i.e., the hidden fragments are actually rejected by the depth test) if the triangles happen to be rendered from nearest to farthest. That is why your result can exceed 1024*1024: each covered pixel can be counted multiple times due to overdraw.
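If what you actually want is a count bounded by the number of covered samples, one common approach is to fill the depth buffer first and only then issue the query, so the depth test rejects everything behind the front-most surface. A minimal sketch of that idea (not necessarily the exact method from the chapter), reusing occlusionQuery[0], mesh, and screenFragmentCount from your code and assuming a single-sampled framebuffer:

// Pass 1: render the mesh (and any occluders) normally so the depth
// buffer ends up holding the nearest depth per pixel.
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LESS);
glDepthMask(GL_TRUE);
mesh->Render();

// Pass 2: issue the query against the filled depth buffer with writes off.
// Only fragments at or in front of the stored depth pass, so the result is
// roughly bounded by the number of covered pixels (times the sample count
// if the framebuffer is multisampled).
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);
glDepthFunc(GL_LEQUAL);   // let the mesh's own front surface pass against itself

glBeginQuery(GL_SAMPLES_PASSED, occlusionQuery[0]);
mesh->Render();
glEndQuery(GL_SAMPLES_PASSED);

glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);
glDepthFunc(GL_LESS);

glGetQueryObjectuiv(occlusionQuery[0], GL_QUERY_RESULT, &screenFragmentCount[0]);

Rendering the same geometry twice produces identical depth values, so GL_LEQUAL lets the visible surface pass while the overdrawn fragments behind it are rejected. Note that GL_SAMPLES_PASSED still counts samples, not pixels, so with MSAA the upper bound is width * height * samples.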