I have written a tile-based engine, and everything looks fine as long as antialiasing is disabled in WebGL.
When antialiasing is enabled, pixels at the edges of tiles are sometimes rendered that are mostly background color, yet they still write to the depth buffer as usual. When a more intense (higher-alpha) pixel arrives later at the same position, it gets discarded by the depth test.
I did attempt to disable the depth buffer and set the blending factors to GL_SRC_ALPHA_SATURATE (source) and GL_ONE (destination), as documented here: http://www.glprogramming.com/red/chapter06.html . This led to purely white pixels being drawn; I'm not sure what's going on there.
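For reference, the setup I tried translates to roughly the following WebGL calls (a sketch; `applyRedBookBlend` is just an illustrative name). Note that the red book's polygon-smoothing recipe also depends on GL_POLYGON_SMOOTH and front-to-back sorted geometry, neither of which WebGL exposes, which may be part of why it misbehaves here:

```javascript
// Sketch of the red-book blend setup in WebGL terms. The original
// recipe also relies on GL_POLYGON_SMOOTH and front-to-back sorting,
// neither of which WebGL provides.
function applyRedBookBlend(gl) {
  gl.disable(gl.DEPTH_TEST);                   // depth testing off, per the red book
  gl.enable(gl.BLEND);
  gl.blendFunc(gl.SRC_ALPHA_SATURATE, gl.ONE); // saturate source alpha, add destination
}
```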
The graphics on my textures are all padded so that WebGL should not have issues with filtering. I've made the padding extreme to ensure this was not a contributing factor.
I've explored glSampleCoverage() and tried a few shots in the dark with random values, producing nothing of worth. I'm unfamiliar with what it actually does and can't find any good examples online, apart from a few abstract suggestions.
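For what it's worth, my understanding (which may be wrong) is that sample coverage only does anything on a multisampled framebuffer: with SAMPLE_COVERAGE enabled, each fragment's multisample coverage mask is ANDed with a mask derived from the value (the fraction of samples to keep) and the invert flag. A minimal sketch (`enableHalfCoverage` is just an illustrative name):

```javascript
// Sketch: keep roughly half of each fragment's samples. This only has a
// visible effect when rendering to a multisampled framebuffer.
function enableHalfCoverage(gl) {
  gl.enable(gl.SAMPLE_COVERAGE);
  gl.sampleCoverage(0.5, false); // value in [0, 1], invert = false
}
```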
I'd rather not disable the depth buffer.
GL_MULTISAMPLE is not available.
I'm working in a C++/Emscripten environment, but that shouldn't matter; I can write inline JavaScript if needed. I figure there are generic OpenGL solutions to this.
Is there any way to disable antialiasing in WebGL for only certain geometry?
Good (antialiasing disabled): [screenshot]
Bad (antialiasing enabled): [screenshot]
All the documentation I can find on WebGL suggests that its built-in facilities only support full-screen anti-aliasing.
The main problem in your case is that many AA algorithms are "post-processing" steps, performed once all your geometry has been rendered. They only know about the pixels, so to speak.
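Concretely, the antialias setting in WebGL is a context-creation attribute, so it can only be requested (or refused) for the entire default framebuffer; there is no per-draw switch. A minimal sketch:

```javascript
// Antialiasing is chosen when the context is created and applies to the
// whole default framebuffer; it cannot be toggled per draw call.
function createContext(canvas, wantAA) {
  return canvas.getContext('webgl', { antialias: wantAA });
}
```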
A possible solution would be to write your own anti-aliasing that works from geometry information (if available), or that operates within the context of a single piece of geometry.
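One way to get per-geometry control (my suggestion, not something WebGL provides directly): render the geometry you want left un-antialiased into an offscreen framebuffer, which in WebGL 1 is never multisampled, then composite that texture onto the antialiased default framebuffer. A sketch of the render-target setup, with hypothetical names:

```javascript
// Create a texture-backed framebuffer. WebGL 1 framebuffer objects are
// never multisampled, so anything drawn into it receives no MSAA; only
// the final compositing pass onto the default framebuffer does.
function createAliasedTarget(gl, width, height) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);
  return { fbo, tex };
}
```

Draw your tiles into this target, then bind the default framebuffer (`gl.bindFramebuffer(gl.FRAMEBUFFER, null)`) and draw a screen-filling quad sampling `tex`.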
http://www.humus.name/index.php?page=3D&ID=87
http://en.wikipedia.org/wiki/Supersampling