Does OpenGL framebuffer blitting take into account gamma correction when enabled?


Here is my problem: when I load textures, I load them as sRGB, so that sampling them converts the texels into linear space. When I was writing to the default framebuffer provided by the windowing system, I enabled GL_FRAMEBUFFER_SRGB so that writes to the framebuffer would convert the colours from linear space back to sRGB space.
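For reference, my texture loading looks roughly like this (a trimmed sketch; width, height and pixels stand in for whatever my image loader returns):

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* GL_SRGB8_ALPHA8 marks the texels as sRGB-encoded, so samples taken
   in the shader are decoded to linear space automatically. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

/* And for the final write to the default framebuffer: */
glEnable(GL_FRAMEBUFFER_SRGB);
```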

The problem now is that I'm rendering into an offscreen FBO and then blitting that into the default framebuffer. When doing this, turning GL_FRAMEBUFFER_SRGB on or off has no effect. I've tried enabling GL_FRAMEBUFFER_SRGB both with the default framebuffer bound and with the offscreen FBO bound; neither works.

What does work, however, is specifying GL_SRGB as the internal format of the texture that I bind as the colour buffer of the offscreen FBO. I'm assuming that in this case, when GL_FRAMEBUFFER_SRGB is enabled, the writes from the fragment shader to the texture are converted from linear space to sRGB space.
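Concretely, the variant that does work looks roughly like this (a trimmed sketch; width and height are whatever my render target size is):

```c
GLuint fbo, colourTex;
glGenTextures(1, &colourTex);
glBindTexture(GL_TEXTURE_2D, colourTex);
/* sRGB internal format on the FBO's colour buffer: */
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colourTex, 0);

/* With this attachment, enabling GL_FRAMEBUFFER_SRGB makes fragment
   shader writes convert from linear to sRGB, as described above. */
glEnable(GL_FRAMEBUFFER_SRGB);
```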

It seems that when blitting from the FBO to the default framebuffer, the GL_FRAMEBUFFER_SRGB conversion doesn't get applied. As I would like to work in linear space until the final transfer to the back buffer of the default framebuffer, how can I apply the conversion as I blit the offscreen FBO to the default framebuffer? I guess that sampling the offscreen FBO as a texture and rendering a quad into the default framebuffer would apply the sRGB conversion, but is there no way of doing this with a framebuffer blit?

1 Answer

Answered by derhass:

When doing this, turning GL_FRAMEBUFFER_SRGB on or off has no effect. I've tried enabling GL_FRAMEBUFFER_SRGB both with the default framebuffer bound and with the offscreen FBO bound; neither works.

GL_FRAMEBUFFER_SRGB does not do what you think it does: it does not turn any framebuffer format from RGB into sRGB. GL_FRAMEBUFFER_SRGB only has an effect when the format of the framebuffer already is an sRGB format, e.g. because a texture with an internalFormat of GL_SRGB8 or GL_SRGB8_ALPHA8 is attached as a color buffer to an FBO. If you want sRGB conversion for the default framebuffer, you must use the window-system-specific APIs to explicitly create a PixelFormat/Visual/FBConfig/whatever with sRGB support. Have a look at the GL / GLX / WGL ARB_framebuffer_sRGB extension specs for how to do that.
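To make that concrete, here is one way to request an sRGB-capable default framebuffer, using GLFW as an example (a sketch; it assumes glfwInit() has already succeeded; raw WGL/GLX would use the WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB / GLX_FRAMEBUFFER_SRGB_CAPABLE_ARB attributes instead):

```c
glfwWindowHint(GLFW_SRGB_CAPABLE, GLFW_TRUE);  /* ask for an sRGB-capable visual */
GLFWwindow *win = glfwCreateWindow(800, 600, "sRGB test", NULL, NULL);
glfwMakeContextCurrent(win);

/* Check what the window system actually gave us: */
GLint encoding = 0;
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
    GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING, &encoding);
/* encoding == GL_SRGB: GL_FRAMEBUFFER_SRGB will have an effect here;
   encoding == GL_LINEAR: it won't, no matter how often you enable it. */
```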

What does work, however, is specifying GL_SRGB as the internal format of the texture that I bind as the colour buffer of the offscreen FBO. I'm assuming that in this case, when GL_FRAMEBUFFER_SRGB is enabled, the writes from the fragment shader to the texture are converted from linear space to sRGB space.

Yes, this is exactly how it is supposed to work. The GL_FRAMEBUFFER_SRGB enable bit only toggles sRGB conversion on formats which support it; it cannot add the conversion to formats which don't.

It seems that when blitting from the FBO to the default framebuffer, the GL_FRAMEBUFFER_SRGB conversion doesn't get applied.

Let's have a look at the most recent OpenGL specification to date, the OpenGL 4.6 core profile specification, section 18.3.1 "Blitting Pixel Rectangles" (emphasis mine):

When values are taken from the read buffer, *if FRAMEBUFFER_SRGB is enabled and* the value of FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING for the framebuffer attachment corresponding to the read buffer is SRGB (see section 9.2.3), the red, green, and blue components are converted from the non-linear sRGB color space according to equation 8.17.

When values are written to the draw buffers, blit operations bypass most of the fragment pipeline. The only fragment operations which affect a blit are the pixel ownership test, the scissor test, and sRGB conversion (see section 17.3.7). Color, depth, and stencil masks (see section 17.4.2) are ignored.

with section 17.3.7 "sRGB Conversion" stating:

If FRAMEBUFFER_SRGB is enabled and the value of FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING for the framebuffer attachment corresponding to the destination buffer is SRGB (see section 9.2.3), the R, G, and B values after blending are converted into the non-linear sRGB color space by [formula follows ...]
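For reference, the formula elided above is the standard linear-to-sRGB encoding; with $c_l$ the linear component and $c_s$ the stored sRGB value:

$$
c_s = \begin{cases} 12.92\,c_l & c_l \le 0.0031308 \\ 1.055\,c_l^{1/2.4} - 0.055 & c_l > 0.0031308 \end{cases}
$$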

Note that section 17.3.7 makes clear that when writing to the color buffer, linear-to-sRGB conversion is only applied if GL_FRAMEBUFFER_SRGB is enabled.

So this leaves us with the following possibilities:

  1. Both source and destination color buffers have a standard NON-sRGB format: the blit just copies the pixel values; no sRGB conversion can occur, no matter how GL_FRAMEBUFFER_SRGB is set.

  2. The source buffer has an sRGB format, the destination a NON-sRGB format: conversion from sRGB to linear is done if, and only if, GL_FRAMEBUFFER_SRGB is enabled.

  3. The source buffer has a NON-sRGB format, the destination an sRGB format: conversion from linear to sRGB is done if, and only if, GL_FRAMEBUFFER_SRGB is enabled (see the sketch after this list).

  4. Both source and destination have sRGB formats: conversion from sRGB to linear AND from linear to sRGB is done if, and only if, GL_FRAMEBUFFER_SRGB is enabled. Note that the conversion might boil down to a no-op here. This includes cases with bilinear filtering, too: the GL spec does not require bilinear filtering on sRGB sources to be applied in linear space after the conversion; it may just as well be applied before.
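Case 3 is the setup the question is after: keep the offscreen FBO in a linear format, get an sRGB-capable default framebuffer from the window system (see above), and enable GL_FRAMEBUFFER_SRGB for the blit. A minimal sketch, assuming a driver that follows the GL 4.4+ rules (see the history below), with fbo, width and height from the earlier setup:

```c
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);  /* linear source, e.g. GL_RGBA8 or GL_RGBA16F */
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);    /* sRGB-capable default framebuffer */

glEnable(GL_FRAMEBUFFER_SRGB);                /* turn on destination encoding */
glBlitFramebuffer(0, 0, width, height,        /* source rectangle      */
                  0, 0, width, height,        /* destination rectangle */
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
glDisable(GL_FRAMEBUFFER_SRGB);
```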

Now, the story could be over here. But it isn't. The behavior of framebuffer blits with sRGB formats has undergone a number of changes in the GL history.

Section 18.3.1 of the OpenGL 4.3 core profile specification states:

When values are taken from the read buffer, if the value of FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING for the framebuffer attachment corresponding to the read buffer is SRGB (see section 9.2.3), the red, green, and blue components are converted from the non-linear sRGB color space according to equation

[The second paragraph is the same as the previous quote from 4.6.]

This means that up to GL 4.3, conversion of the source buffer was always done when it had an sRGB format, regardless of the GL_FRAMEBUFFER_SRGB setting. For the destination buffer, that setting is still relevant, though.

Now, in OpenGL 3.3, there is no mention at all of the behavior of blits with sRGB formats. The relevant section, 4.3.2 "Copying Pixels", only states:

Blit operations bypass the fragment pipeline. The only fragment operations which affect a blit are the pixel ownership test and the scissor.

This means that a blit will also bypass the linear-to-sRGB conversion for the destination buffer, and the spec makes no statement about source conversion at all.

Also note that drivers have historically ignored the spec and done whatever they thought best when it comes to sRGB conversions. See for example the article "March 2015 OpenGL drivers status and FB sRGB conversions". There is also this nice patch to the piglit OpenGL test suite discussing the issues and presenting a somewhat sad conclusion:

I think the short summary is: sRGB in OpenGL is just about as broken as it possibly can be. :( At least, every game developer I've ever talked to tells me so. Ugh.

However, in my experience, most current GL >= 4.4 drivers do the conversions as specified, so the situation isn't that bad any more. But I wouldn't bet my life on it, either.