I am receiving a video frame in a GstVideoFrame structure inside a GStreamer element that is written in C++.
The frame is in YUV420 NV12 format.
In this element, I am trying to copy the Y-data and UV-data into separate buffers.
According to videolan.org, the YUV420 NV12 data is stored as follows in the incoming frame buffer (info copied from the website):
NV12:
Related to I420, NV12 has one luma "luminance" plane Y and one plane with U and V values interleaved.
In NV12, chroma planes (blue and red) are subsampled in both the horizontal and vertical dimensions by a factor of 2.
For a 2×2 group of pixels, you have 4 Y samples and 1 U and 1 V sample.
It can be helpful to think of NV12 as I420 with the U and V planes interleaved.
Here is a graphical representation of NV12. Each letter represents one bit:
For 1 NV12 pixel: YYYYYYYY UVUV
For a 2-pixel NV12 frame: YYYYYYYYYYYYYYYY UVUVUVUV
For a 50-pixel NV12 frame: Y×8×50 (UV)×2×50
For a n-pixel NV12 frame: Y×8×n (UV)×2×n
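Putting those bit counts together: each pixel contributes 8 bits of Y plus, on average, 4 bits of interleaved U/V, i.e. 12 bits (1.5 bytes) per pixel.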
However, I can't seem to calculate the offset of the Y-data and UV-data in the buffer.
Update_1: I have the height and width of the frame, so I can calculate the size of the Y-data and UV-data as:
y_size = width * height;
uv_size = y_size / 2;
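For example, assuming a tightly packed 1920×1080 frame (the resolution is just an illustrative value):

```cpp
guint width  = 1920;
guint height = 1080;

guint y_size  = width * height;  // 2073600 bytes, 1 byte of Y per pixel
guint uv_size = y_size / 2;      // 1036800 bytes, interleaved U/V at half the Y size

// total NV12 buffer size = width * height * 3 / 2
```

Note that this only holds when the rows are tightly packed; GStreamer may pad each row to a plane stride, in which case the per-plane stride and offset reported by the GstVideoFrame should be used instead (see the sketch further below).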
Any help or comments regarding this will be appreciated.
Thanks.
Thanks to @Ext3h, this is how I was able to separate the Y-data and UV-data from the incoming YUV frame.
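A minimal sketch of that approach, assuming the buffer has already been mapped into a GstVideoFrame; the helper name copy_nv12_planes and the std::vector output buffers are illustrative, not part of the original code:

```cpp
#include <gst/video/video.h>
#include <cstring>
#include <vector>

// Copies the Y plane and the interleaved UV plane of a mapped NV12
// GstVideoFrame into two separate, tightly packed buffers.
static void
copy_nv12_planes (const GstVideoFrame * frame,
                  std::vector<guint8> & y_buf,
                  std::vector<guint8> & uv_buf)
{
  const guint width  = GST_VIDEO_FRAME_WIDTH (frame);
  const guint height = GST_VIDEO_FRAME_HEIGHT (frame);

  /* In NV12, plane 0 is Y and plane 1 is the interleaved UV data. */
  const guint8 *y_src  = (const guint8 *) GST_VIDEO_FRAME_PLANE_DATA (frame, 0);
  const guint8 *uv_src = (const guint8 *) GST_VIDEO_FRAME_PLANE_DATA (frame, 1);

  const gint y_stride  = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0);
  const gint uv_stride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 1);

  y_buf.resize (width * height);        /* y_size  = width * height     */
  uv_buf.resize (width * height / 2);   /* uv_size = width * height / 2 */

  /* Copy row by row so any stride padding in the source is skipped. */
  for (guint row = 0; row < height; ++row)
    memcpy (&y_buf[row * width], y_src + row * y_stride, width);

  /* The UV plane has width bytes per row (width/2 U/V pairs) and height/2 rows. */
  for (guint row = 0; row < height / 2; ++row)
    memcpy (&uv_buf[row * width], uv_src + row * uv_stride, width);
}
```

If you only have the raw GstBuffer, gst_video_frame_map() together with the negotiated GstVideoInfo gives you the mapped GstVideoFrame first, and gst_video_frame_unmap() releases it afterwards.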