sampler value of texture from DisparityFloat16 pixel format on iOS OpenGLES


I want to use the depthDataMap from the iPhone X TrueDepth camera as a texture in my OpenGL ES project. I have downloaded some Swift samples, and it seems the depth map can be created and sampled as a float texture in Metal. In OpenGL ES, however, the only way I found to create a texture from the depth buffer is:

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, depthWidth, depthHeight, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, CVPixelBufferGetBaseAddress(depthBuffer));

The sampled value is different from the value exported as a CIImage with the DisparityFloat16 pixel type: it is much lower, and not on a linear scale compared to the CIImage.

This is the sampled value in OpenGL ES: [screenshot of sampled values]

This is the value via code: CIImage *image = [CIImage imageWithCVImageBuffer:depthData.depthDataMap];

[screenshot of CIImage values]

Does anyone have the same issue?

1 Answer

Answered by Michael:

Well, it looks like you're specifying the pixel data type as GL_UNSIGNED_SHORT; try changing it to GL_HALF_FLOAT (if using DisparityFloat16) or GL_FLOAT (if using DisparityFloat32). Note that on OpenGL ES 2.0 the half-float token comes from the OES_texture_half_float extension (GL_HALF_FLOAT_OES); on ES 3.0 you can use GL_HALF_FLOAT with a single-channel float format, e.g. internal format GL_R16F and format GL_RED.

Also, if you want to display the depth buffer as a texture, you should convert the depth data to values that mean something in a grayscale image. If you normalize your depth buffer values to integers between 0 and 255, your picture will look a whole lot better.
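That normalization can be sketched in plain C as follows (disparity_to_gray is a hypothetical helper; the min/max here are scanned from the buffer itself, though a real app might use the range reported by AVDepthData instead):

```c
#include <stdint.h>
#include <stddef.h>

/* Map raw disparity floats onto 0..255 grayscale bytes by
   linearly rescaling between the buffer's min and max. */
static void disparity_to_gray(const float *disp, uint8_t *gray, size_t n) {
    float lo = disp[0], hi = disp[0];
    for (size_t i = 1; i < n; i++) {
        if (disp[i] < lo) lo = disp[i];
        if (disp[i] > hi) hi = disp[i];
    }
    float range = (hi > lo) ? (hi - lo) : 1.0f;  /* avoid divide-by-zero */
    for (size_t i = 0; i < n; i++)
        gray[i] = (uint8_t)((disp[i] - lo) / range * 255.0f + 0.5f);
}
```

The resulting bytes can be uploaded as a GL_LUMINANCE (ES 2.0) or GL_R8 (ES 3.0) texture for display.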

For more information, Apple has sample code for this exact thing. It uses Metal, but the principle applies to OpenGL as well. There's also a really nice tutorial with sample code that does this.