Android Encode h264 using libavcodec for ARGB


I have a stream of buffers, each containing a 480x800 ARGB image (a byte array of size 480*800*4). I want to encode around 10,000 such images into an H.264 stream at a specified frame rate (12 fps). The encoding example I found shows how to encode images into video, but it requires the input to be YUV420.

Now I have ARGB images that I want to encode as CODEC_ID_H264. How to convert RGB from YUV420p for ffmpeg encoder? shows how to do it for RGB24, but how do I do it for RGB32, meaning ARGB image data?

How do I use libavcodec for this?

EDIT: I found How to convert RGB from YUV420p for ffmpeg encoder?, but I don't understand it.

From the first link, I learned that the AVFrame struct contains data[0], data[1], and data[2], which are filled with the Y, U, and V planes respectively.

The second link shows how to use sws_scale to convert RGB24 to YUV420P:

SwsContext *ctx = sws_getContext(imgWidth, imgHeight,
                                 AV_PIX_FMT_RGB24, imgWidth, imgHeight,
                                 AV_PIX_FMT_YUV420P, 0, 0, 0, 0);
uint8_t *inData[1] = { rgb24Data };   // RGB24 has one packed plane
int inLinesize[1] = { 3 * imgWidth }; // RGB24 stride: 3 bytes per pixel
sws_scale(ctx, inData, inLinesize, 0, imgHeight, dst_picture.data, dst_picture.linesize);

Here I assume that rgb24Data is the buffer containing the RGB24 image bytes.

So how do I use this for ARGB, which is 32 bits per pixel? Do I need to strip off the alpha channel manually, or is there another workaround?

Thank you

1 Answer

Answer by szatmary (10 votes):

Just switch the pixel format and line stride from RGB24 to ARGB:

SwsContext *ctx = sws_getContext(imgWidth, imgHeight,
                                 AV_PIX_FMT_ARGB, imgWidth, imgHeight,
                                 AV_PIX_FMT_YUV420P, 0, 0, 0, 0);
uint8_t *inData[1] = { argbData };    // ARGB is also a single packed plane
int inLinesize[1] = { 4 * imgWidth }; // ARGB stride: 4 bytes per pixel
sws_scale(ctx, inData, inLinesize, 0, imgHeight, dst_picture.data, dst_picture.linesize);