Here is my problem: when I load textures, I load them as sRGB so that they are converted into linear space. When I was writing to the default framebuffer provided by the windowing system, I enabled `GL_FRAMEBUFFER_SRGB` so that writes to the framebuffer would convert the colours from linear space to sRGB space.
The problem now is that I'm rendering into an offscreen FBO and then blitting that into the default framebuffer. When doing this, turning `GL_FRAMEBUFFER_SRGB` on or off has no effect. I've tried enabling `GL_FRAMEBUFFER_SRGB` both with the default framebuffer bound and with the offscreen FBO bound; neither works.
What does work, however, is specifying `GL_SRGB` as the internal format of the texture that I bind as the colour buffer of the offscreen FBO. I'm assuming that in this case, when `GL_FRAMEBUFFER_SRGB` is enabled, the writes from the fragment shader to the texture convert the colours from linear space to sRGB space.
So it seems that when blitting from the FBO to the default framebuffer, the `GL_FRAMEBUFFER_SRGB` conversion doesn't get applied. Since I would like to work in linear space until the final transfer to the backbuffer of the default framebuffer, how can I apply the conversion as I blit the offscreen FBO to the default framebuffer? I guess that sampling the offscreen FBO as a texture and rendering a quad into the default framebuffer would apply the sRGB conversion, but is there no way of doing this conversion with a `glBlitFramebuffer`?
`GL_FRAMEBUFFER_SRGB` does not do what you think it does: it does not turn an arbitrary framebuffer format into sRGB. `GL_FRAMEBUFFER_SRGB` only has an effect when the format of the framebuffer already is an sRGB format, e.g. because a texture with an `internalFormat` of `GL_SRGB8` or `GL_SRGB8_ALPHA8` is attached as a color buffer of an FBO. If you want sRGB conversion for the default framebuffer, you must use the window-system-specific APIs to explicitly create a PixelFormat/Visual/FBConfig with sRGB support. Have a look at the `GL`/`GLX`/`WGL` `ARB_framebuffer_sRGB` extension specs for how to do that.

Yes, this is exactly how it is supposed to work. The `GL_FRAMEBUFFER_SRGB` enable bit is only a means to disable the sRGB conversions on formats which would normally perform them, not the other way around.

Let's have a look at the most recent OpenGL specification to date, the OpenGL 4.6 core profile specification. Section 18.3.1 "Blitting Pixel Rectangles" specifies whether sRGB conversion takes place during a blit, and section 17.3.7 "sRGB Conversion" defines the conversion itself.
Note that section 17.3.7 makes clear that, when writing to the color buffer, linear-to-sRGB conversion is only applied if `GL_FRAMEBUFFER_SRGB` is enabled.

So this leaves us with the following possibilities:
- Both source and destination color buffers have a standard non-sRGB format: the blit copies the pixel values; no sRGB conversion can occur, no matter how `GL_FRAMEBUFFER_SRGB` is set.
- The source buffer has an sRGB format, the destination a non-sRGB format: conversion from sRGB to linear is done if, and only if, `GL_FRAMEBUFFER_SRGB` is enabled.
- The source buffer has a non-sRGB format, the destination an sRGB format: conversion from linear to sRGB is done if, and only if, `GL_FRAMEBUFFER_SRGB` is enabled.
- Both source and destination have sRGB formats: conversion from sRGB to linear AND from linear to sRGB is done if, and only if, `GL_FRAMEBUFFER_SRGB` is enabled. Note that the conversion might boil down to a no-op here. This includes cases with bilinear filtering, too: the GL spec does not require bilinear filtering on sRGB sources to be applied in linear space after the conversion; it may just as well be applied before.

Now, the story could be over here. But it isn't. The behavior of framebuffer blits with sRGB formats has undergone a number of changes in GL's history.
Section 18.3.1 of the OpenGL 4.3 core profile specification specified the source conversion differently: up to GL 4.3, conversion of the source buffer was always done when it had an sRGB format, no matter the setting of the `GL_FRAMEBUFFER_SRGB` enable. For the destination buffer, this setting was still relevant, though.

In OpenGL 3.3, there is no mention at all of the behavior when blitting with sRGB formats. The relevant section is 4.3.2 "Copying Pixels", which only describes the blit as a bypass of most of the fragment pipeline. This means that it will also bypass linear-to-sRGB conversion for the destination buffer, and it makes no statement about the source conversion at all.
Also note that drivers have historically ignored the spec and done whatever they thought best when it comes to sRGB conversions. See for example the article "March 2015 OpenGL drivers status and FB sRGB conversions". There is also a nice patch to the piglit OpenGL test suite discussing these issues and presenting a somewhat sad conclusion about driver conformance.
However, in my experience, most current GL >= 4.4 drivers do the conversions as specified, so the situation isn't that bad any more. But I wouldn't bet my life on it, either.
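On such a spec-conforming driver, the conversion you ask for can be had directly from the blit, provided the destination actually has an sRGB format. A hedged sketch of the approach (assuming a GL >= 4.4 context whose default framebuffer was created with an sRGB-capable pixel format via the window-system API, and that an extension loader such as GLAD or GLEW has been initialized; `width`/`height` and the function names are illustrative):

```c
/* Sketch: render in linear space into an offscreen FBO, then let the blit
 * itself do the linear-to-sRGB conversion. This requires the default
 * framebuffer to be sRGB-capable (WGL/GLX_FRAMEBUFFER_SRGB_CAPABLE_ARB),
 * and assumes GL function pointers have already been loaded. */

static GLuint create_linear_fbo(GLsizei width, GLsizei height)
{
    GLuint color, fbo;
    glGenTextures(1, &color);
    glBindTexture(GL_TEXTURE_2D, color);
    /* A linear format; use GL_SRGB8_ALPHA8 here instead if you want
     * conversion applied on every write into the FBO. */
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA16F, width, height);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, color, 0);
    return fbo;
}

static void present(GLuint fbo, GLsizei width, GLsizei height)
{
    glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);  /* sRGB-capable default FB */
    glEnable(GL_FRAMEBUFFER_SRGB);     /* allow linear-to-sRGB on write */
    glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);
    glDisable(GL_FRAMEBUFFER_SRGB);
}
```

If the default framebuffer cannot be made sRGB-capable, the remaining option is indeed the one you suggest: sample the FBO texture and convert in a fullscreen-quad pass.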