glReadPixels always returns a black image

Some time ago I wrote code to draw an OpenGL scene to a bitmap in Delphi RAD Studio XE7, and it worked well. The code draws and finalizes a scene, then reads the pixels back with the glReadPixels function. I recently tried to compile exactly the same code in Lazarus, but I only get a black image.

Here is the code:

// create main render buffer
glGenFramebuffers(1, @m_OverlayFrameBuffer);
glBindFramebuffer(GL_FRAMEBUFFER, m_OverlayFrameBuffer);

// create and link color buffer to render to
glGenRenderbuffers(1, @m_OverlayRenderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, m_OverlayRenderBuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER,
                          GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER,
                          m_OverlayRenderBuffer);

// create and link depth buffer to use
glGenRenderbuffers(1, @m_OverlayDepthBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, m_OverlayDepthBuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER,
                          GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER,
                          m_OverlayDepthBuffer);

// check if render buffers were created correctly and return result
Result := (glCheckFramebufferStatus(GL_FRAMEBUFFER) = GL_FRAMEBUFFER_COMPLETE);
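// Sketch (not in the original code): if the check above ever fails, logging
// the raw status value narrows down which attachment is wrong. `status` is
// assumed to be a GLenum declared in the surrounding var section, and
// IntToHex comes from SysUtils.
status := glCheckFramebufferStatus(GL_FRAMEBUFFER);
if status <> GL_FRAMEBUFFER_COMPLETE then
    WriteLn('FBO incomplete, status = $', IntToHex(status, 4));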

...

// flush OpenGL
glFinish;
glPixelStorei(GL_PACK_ALIGNMENT,   4);
glPixelStorei(GL_PACK_ROW_LENGTH,  0);
glPixelStorei(GL_PACK_SKIP_ROWS,   0);
glPixelStorei(GL_PACK_SKIP_PIXELS, 0);

// create pixels buffer
SetLength(pixels, (m_pOwner.ClientWidth * m_Factor) * (m_pOwner.ClientHeight * m_Factor) * 4);
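// note: 4 bytes per pixel here matches the GL_RGBA / GL_UNSIGNED_BYTE
// combination passed to glReadPixels below, and a 4-byte-aligned row is
// consistent with the GL_PACK_ALIGNMENT of 4 set above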

// is alpha blending or antialiasing enabled?
if (m_Transparent or (m_Factor <> 1)) then
    // notify that pixels will be read from color buffer
    glReadBuffer(GL_COLOR_ATTACHMENT0);
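// note: when neither condition holds, no glReadBuffer call is made at all;
// for a bound framebuffer object the initial read buffer is already
// GL_COLOR_ATTACHMENT0, so that branch relies on the driver default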

// copy scene from OpenGL to pixels buffer
glReadPixels(0,
             0,
             m_pOwner.ClientWidth  * m_Factor,
             m_pOwner.ClientHeight * m_Factor,
             GL_RGBA,
             GL_UNSIGNED_BYTE,
             pixels);
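
For completeness, a glGetError check right after the read would rule out a silently rejected call. This is only a sketch, not part of the original code; the err local (a GLenum) is assumed here:

// read back any pending OpenGL error; anything other than GL_NO_ERROR
// means one of the calls above was rejected by the driver
err := glGetError();
if err <> GL_NO_ERROR then
    WriteLn('OpenGL error after glReadPixels: $', IntToHex(err, 4));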

I have already verified, and I'm 100% sure, that something is really drawn in my scene (on the Lazarus side too), that GLext is properly initialized, and that the framebuffer is built correctly (the condition glCheckFramebufferStatus(GL_FRAMEBUFFER) = GL_FRAMEBUFFER_COMPLETE does return true). I would be very grateful if someone could point out what I'm doing wrong in my code, knowing once more that the same code works well in Delphi RAD Studio XE7 but not in Lazarus on the same computer.
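
To confirm the buffer really comes back all black rather than being mis-displayed later, the returned bytes can be scanned directly. This is just an illustrative sketch; the i and hasData locals are hypothetical:

// scan the read-back buffer for any non-zero byte; if none is found,
// the image is genuinely black/empty, not merely shown incorrectly
hasData := False;
for i := 0 to High(pixels) do
    if pixels[i] <> 0 then
    begin
        hasData := True;
        Break;
    end;
WriteLn('Pixel buffer contains data: ', hasData);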

Regards
