Why does the size of raw_image equal the sensor size?


I would like to access the intensity of individual color pixels within the RGGB Bayer pattern of my Sony camera. With the rawpy package it seems I can obtain the raw_image, but it is a 2D array of shape (4024, 6048), which is the size of the sensor. Shouldn't the array contain RGGB data for each pixel, and therefore have the shape (8048, 12096)?

import rawpy
import numpy as np

im_raw = rawpy.imread('Test.ARW').raw_image
print(np.shape(im_raw))   # (4024, 6048)

There are 2 answers

Answer by jsbueno

The RawPy object created by rawpy.imread has, in parallel to the pixel values in the .raw_image array you are looking at, another attribute, .raw_colors: a second array of the same shape with a number for each pixel. In your sample image these numbers run from 0 to 3, and each of them presumably identifies a color channel.
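
A minimal sketch of how .raw_colors can be used as a mask over .raw_image to pull out the values of a single channel (the file name is the one from the question; the assumption that index 0 is red on this RGGB sensor is mine):

import rawpy

# Assumption: color index 0 corresponds to red on this RGGB sensor.
with rawpy.imread('Test.ARW') as raw:
    values = raw.raw_image.copy()     # sensor values, shape (4024, 6048)
    colors = raw.raw_colors.copy()    # same shape, entries 0..3
red_values = values[colors == 0]      # flat array of all pixels with color index 0
print(red_values.size, red_values.mean())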

Other attributes of the RawPy object may tell you how to map these indices to color channels; otherwise, check the libraw or Sony documentation for this raw format, or simply experiment with the channels.
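
Two attributes worth inspecting for that mapping (a hedged sketch; the values in the comments are only what I would expect for an RGGB sensor, not verified against this particular file):

import rawpy

with rawpy.imread('Test.ARW') as raw:
    print(raw.color_desc)    # e.g. b'RGBG' - the color name for each index 0..3
    print(raw.raw_pattern)   # 2x2 array of indices describing the Bayer layout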

As for the sensor size vs. image pixel size: it is just that - you have one color value for each sensor pixel, and Sony, rawpy and other software have the means to build a full RGB image out of that. (Call the .postprocess() method on the RawPy object to get a uint8 RGB image of roughly 4024 x 6048 x 3.)
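
A short sketch of that step (the use_camera_wb argument is optional and just an example):

import rawpy

# Demosaic the Bayer data into a regular RGB image.
with rawpy.imread('Test.ARW') as raw:
    rgb = raw.postprocess(use_camera_wb=True)   # uint8, roughly (4024, 6048, 3)
print(rgb.shape, rgb.dtype)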

Answer by Ture Pålsson

Each pixel of a Bayer-pattern sensor is either red, green or blue, and there are as many greens as there are reds and blues together. There is no full colour information for every pixel. To make a full RGB image, the three components have to be interpolated to fill in the gaps.
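
A hedged sketch of that layout in code: for an RGGB mosaic you can slice out four half-resolution colour planes directly, although which corner of each 2x2 block holds which colour depends on the camera, so check raw_pattern (or raw_colors) before trusting the labels below.

import rawpy

with rawpy.imread('Test.ARW') as raw:
    mosaic = raw.raw_image.copy()     # shape (4024, 6048)
r  = mosaic[0::2, 0::2]   # assumed red plane,          shape (2012, 3024)
g1 = mosaic[0::2, 1::2]   # assumed first green plane
g2 = mosaic[1::2, 0::2]   # assumed second green plane
b  = mosaic[1::2, 1::2]   # assumed blue plane
print(r.shape, g1.shape, g2.shape, b.shape)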