Greyscale image in SDL2


I have an array of uint8_t which represents a greyscale picture, where each pixel is one uint8_t. I would like to display this in a window using the SDL2 library.

I have tried to create an SDL_Surface from the array by doing

mSurface = SDL_CreateRGBSurfaceFrom(mData, mWidth, mHeight, 8, mWidth, 0xFF0000, 0xFF0000, 0xFF0000, 0xFF0000);

However, there is a problem when a depth of 8 bits is passed to SDL_CreateRGBSurfaceFrom (as I have done here): according to the SDL2 wiki, "If depth is 4 or 8 bits, an empty palette is allocated for the surface". If it weren't for that, I would be able to tell SDL that each pixel is one byte and to use that byte for the R, G, and B values.

I want a depth of 8 bits per pixel because that's how my data is stored, but I don't want to use a palette.

Is there any way to make SDL not assume I want a palette, and just display the image with the R, G, and B masks all set to that byte?

I understand that an alternative solution would be to convert my greyscale image into RGB by copying each byte three times, and then to display it. However, I would like to avoid doing that if possible because all that copying would be slow.
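
For reference, the conversion I mean would look roughly like this (just a sketch reusing the mData, mWidth, and mHeight variables from above; the rgb buffer is a placeholder name):

uint8_t *rgb = malloc((size_t)mWidth * mHeight * 3);  /* needs <stdlib.h> */
for (int p = 0; p < mWidth * mHeight; p++)
{
    /* duplicate each grey byte into the R, G, and B channels */
    rgb[p * 3 + 0] = mData[p];
    rgb[p * 3 + 1] = mData[p];
    rgb[p * 3 + 2] = mData[p];
}
/* rgb could then be handed to SDL_CreateRGBSurfaceFrom() with depth 24
   and pitch mWidth * 3 */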

1 Answer

Jonny D (accepted answer):

SDL_CreateRGBSurfaceFrom() does not handle 8-bit true color formats. As you noted, it creates a blank palette for 8-bit depths. The most obvious thing to do is to fill in the palette and just let it do its thing.

Here's some code for a grayscale palette:

SDL_Color colors[256];
int i;

// Map every palette index to the same grey level (and opaque alpha).
for(i = 0; i < 256; i++)
{
    colors[i].r = colors[i].g = colors[i].b = i;
    colors[i].a = 255;
}

SDL_SetPaletteColors(mSurface->format->palette, colors, 0, 256);
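
Putting it together, the whole flow might look something like this (only a sketch: it assumes an SDL_Renderer called renderer already exists and reuses the variables from the question; at 8 bits per pixel the mask arguments are ignored, so they can simply be 0):

/* Wrap the existing grey bytes in an 8-bit surface; no pixel data is copied. */
SDL_Surface *mSurface = SDL_CreateRGBSurfaceFrom(mData, mWidth, mHeight,
                                                 8, mWidth, 0, 0, 0, 0);

/* Install the grayscale palette built above, so index i draws as grey level i. */
SDL_SetPaletteColors(mSurface->format->palette, colors, 0, 256);

/* Upload to a texture and draw it with the assumed renderer. */
SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, mSurface);
SDL_RenderCopy(renderer, texture, NULL, NULL);
SDL_RenderPresent(renderer);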

Also, a rule of thumb: never avoid something that works just because it is "slow". Do avoid things that are "too slow". You might only find out whether something is "too slow" by trying it.

In this case, you might only be loading this image once, in which case the performance effect afterwards would be negligible.