I use a Rust library to parse raw ARW images (Sony's raw format). It gives me a raw buffer of 16-bit pixels and the CFA (Color Filter Array), which is RGGB; the data buffer contains height * width pixels in Bayer encoding. Each pixel is stored as 16 bits (although I think the camera only uses 12 or 14 of the 16 bits per pixel).
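One sanity check: the largest sample value hints at the effective bit depth (12-bit data tops out near 4095, 14-bit near 16383). A minimal diagnostic, assuming the data is already decoded into the Vec<u16> from the code below:

// largest sample value hints at the effective bit depth
let max_sample = decoded_image_u16.iter().copied().max().unwrap_or(0);
println!("max sample value: {}", max_sample);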

I'm using a Bayer library for the demosaicing step. Currently, my final image is too dark and has a greenish cast after the demosaic process. I suspect the error is in how I convert each 16-bit value to 8 bits before passing the data to the bayer library: I divide by u16::MAX and multiply by u8::MAX. If the camera really only fills 12 of the 16 bits, that alone would explain the darkness: a full-scale 12-bit sample of 4095 maps to 4095 / 65535 * 255 ≈ 16, i.e. almost black. However, I don't know if this is the right approach.

I suspect I need to perform additional processing steps between parsing the raw file and passing the data to the bayer library. Any advice would be appreciated.

I can confirm that at least some of the demosaicing works. Here's a screenshot of the resulting image:

demosaicing attempt (dark and too much green)

Current Code

The libraries I'm using are rawloader and bayer.

use rawloader::RawImageData;

let decoded_raw = rawloader::decode_file(path).unwrap();
let decoded_image_u16 = match &decoded_raw.data {
    RawImageData::Integer(data) => data,
    RawImageData::Float(_) => panic!("not supported yet"),
};

// u16 to u8 (this is probably wrong)
let decoded_image_u8 = decoded_image_u16
    .iter()
    .map(|val| {
        // todo find out how to interpret the u16!
        let val_f32 = *val as f32;
        let u16_max_f32 = u16::MAX as f32;
        let u8_max_f32 = u8::MAX as f32;
        (val_f32 / u16_max_f32 * u8_max_f32) as u8
    })
    .collect::<Vec<u8>>();

// prepare final RGB buffer
let bytes_per_pixel = 3; // RGB
let mut demosaic_buf = vec![0; bytes_per_pixel * decoded_raw.width * decoded_raw.height];
let mut dst = bayer::RasterMut::new(
    decoded_raw.width,
    decoded_raw.height,
    bayer::RasterDepth::Depth8,
    &mut demosaic_buf,
);

// DEMOSAIC
// adapter so that `bayer::run_demosaic` can read from the Vec;
// `std::io::Cursor` implements `std::io::Read`, so it works directly
let mut decoded_image_u8 = std::io::Cursor::new(decoded_image_u8);

bayer::run_demosaic(
    &mut decoded_image_u8,
    bayer::BayerDepth::Depth8,
    // RGGB is definitely right for my ARW file
    bayer::CFA::RGGB,
    bayer::Demosaic::Linear,
    &mut dst,
)
.unwrap();
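For context, the kind of alternative I'm wondering about would scale by the sensor's own range instead of by u16::MAX. A rough sketch, assuming I read the rawloader docs correctly and blacklevels/whitelevels are per-channel arrays on the decoded image (I just use the first entry of each here for simplicity):

// Sketch: subtract the black level and normalize by the white level,
// then expand to the u8 range.
let black = decoded_raw.blacklevels[0] as f32;
let white = decoded_raw.whitelevels[0] as f32;
let decoded_image_u8 = decoded_image_u16
    .iter()
    .map(|&val| {
        let norm = ((val as f32 - black) / (white - black)).clamp(0.0, 1.0);
        (norm * u8::MAX as f32) as u8
    })
    .collect::<Vec<u8>>();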

2 Answers

Finomnis

I'm not sure if this is connected to the actual problem, but your conversion is way overkill.

To convert from the full range of a u16 to the full range of a u8, use:

(x >> 8) as u8

For example:

fn main() {
    let convert = |x: u16| (x >> 8) as u8;

    println!("{} -> {}", 0, convert(0));
    println!("{} -> {}", 30000, convert(30000));
    println!("{} -> {}", u16::MAX, convert(u16::MAX));
}

Output:

0 -> 0
30000 -> 117
65535 -> 255

I might be able to help you further if you post the input image, but without being able to reproduce your problem I don't think there will be much else here.

Amacs

Your image is actually showing the raw sensor data, i.e. the luminosity each photosite recorded. The cast is dark greenish because the sensor uses a 2x2 RGGB subpixel arrangement, so green is sampled twice as often as red or blue. During interpolation, the second green pixel group also contributes to brightness, which is why an image properly developed from the RAW data ends up significantly brighter. Try RawTherapee: it has a demosaicing (deinterpolation) function, and you can choose the pattern that matches your sensor's layout.
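To illustrate the white-balance side of this: the green cast is usually removed before demosaicing by multiplying each Bayer sample with its channel's gain. A minimal sketch, assuming a normalized (0.0..=1.0) f32 Bayer buffer in RGGB layout and that rawloader's wb_coeffs field holds [R, G, B, G2] multipliers (an assumption to verify against the crate docs):

// Hypothetical helper: apply white-balance gains in place to a
// normalized (0.0..=1.0) Bayer buffer with RGGB layout.
fn apply_wb(raw: &mut [f32], width: usize, height: usize, wb: [f32; 4]) {
    for row in 0..height {
        for col in 0..width {
            let gain = match (row % 2, col % 2) {
                (0, 0) => wb[0], // red site
                (1, 1) => wb[2], // blue site
                _ => wb[1],      // green sites
            };
            let v = &mut raw[row * width + col];
            *v = (*v * gain).min(1.0);
        }
    }
}

With typical gains that boost red and blue relative to green, this evens out the channels before the demosaic step; applying a gamma curve afterwards (raw data is linear) then takes care of much of the remaining darkness.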