On Android, it is possible to make the camera write its output directly to an OpenGL texture (of type GL_TEXTURE_EXTERNAL_OES), avoiding buffers on the CPU altogether.
Is such a thing possible on iOS?
The output you get from the camera on iOS is a CMSampleBufferRef, with a CVPixelBufferRef inside (see the documentation). Since iOS 5, the CoreVideo framework has had CVOpenGLESTextureCache, which lets you create an OpenGL ES texture directly from a CVPixelBufferRef, avoiding any copies. Check the RosyWriter sample on Apple's developer website; it's all there.
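To make this concrete, here is a minimal sketch of the approach (in Swift for brevity; RosyWriter itself is Objective-C). The class name, queue label, and error handling are illustrative, not taken from the sample, and it assumes an EAGLContext already exists. One difference from Android worth noting: the cache hands back an ordinary GL_TEXTURE_2D, not a GL_TEXTURE_EXTERNAL_OES target.

```swift
import AVFoundation
import CoreVideo
import OpenGLES

// Illustrative sketch: camera frames -> OpenGL ES texture without CPU copies.
final class CameraTextureSource: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private var textureCache: CVOpenGLESTextureCache?

    init(eaglContext: EAGLContext) {
        super.init()
        // The cache hands out GL textures backed by the camera's CVPixelBuffers.
        CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, eaglContext, nil, &textureCache)

        let device = AVCaptureDevice.default(for: .video)!
        session.addInput(try! AVCaptureDeviceInput(device: device))

        let output = AVCaptureVideoDataOutput()
        // BGRA is a format the texture cache can map directly to a GL texture.
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // The CVPixelBuffer inside the CMSampleBuffer is the camera frame itself.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let cache = textureCache else { return }

        var texture: CVOpenGLESTexture?
        // No memcpy here: the returned texture aliases the pixel buffer's memory.
        CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer, nil,
            GLenum(GL_TEXTURE_2D), GL_RGBA,
            GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
            GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
            GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), 0, &texture)

        if let texture = texture {
            glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture))
            // ... draw with the texture, then release it and flush the cache
            // so CoreVideo can recycle the underlying buffers.
            CVOpenGLESTextureCacheFlush(cache, 0)
        }
    }
}
```

One thing to keep in mind: hold on to the CVOpenGLESTexture for as long as you are rendering from it, and flush the cache once per frame after you are done, as shown above; otherwise the pool of camera buffers can run dry.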