Is there any simple way (tool or script), given two images of exactly the same scene in different lighting conditions, to normalize them so they appear to have been taken under the same conditions?
I have read a bit about histograms and how this normalization should work, but I can't find any real (non-theoretical) solution.
There are plenty of implementations out there, but the really good ones are patented and kept under lock and key by companies like Fuji, Nikon, Polaroid, etc.
You're going to have to do the math and/or mimic other open-source works.
PHP's Imagick offers methods such as Imagick::contrastStretchImage and Imagick::linearStretchImage to align, trim, and shift histograms. If you are really attempting to adjust the luminance, Imagick::colorMatrixImage can be used to rebuild or alter an image's color balance. Take a look at Fred's whitebalance script for an example.
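As a starting point, here's a minimal sketch that stretches both shots' histograms to the full dynamic range so their contrast becomes comparable. The filenames and the 1% clip fraction are placeholders; adjust to taste:

```php
<?php
// Stretch an image's histogram so that the darkest/brightest 1% of
// pixels are clipped and the remainder spans the full dynamic range.
function normalizeContrast(string $inPath, string $outPath): void
{
    $img = new Imagick($inPath);

    // contrastStretchImage() takes absolute pixel counts, not fractions:
    // pixels below the black point go to black, pixels above the white
    // point go to white, and everything in between is stretched.
    $pixels = $img->getImageWidth() * $img->getImageHeight();
    $img->contrastStretchImage(0.01 * $pixels, 0.99 * $pixels);

    $img->writeImage($outPath);
    $img->clear();
}

// Run both shots of the scene through the same stretch.
normalizeContrast('scene_a.jpg', 'scene_a_norm.jpg');
normalizeContrast('scene_b.jpg', 'scene_b_norm.jpg');
```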
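And here's a sketch of a simple gray-world white balance using Imagick::colorMatrixImage, in the same spirit as Fred's script (though much cruder). The gray-world assumption says a scene's average color should be neutral gray, so each channel's gain is scaled until its mean matches the overall mean. The input filename is hypothetical, and colorMatrixImage requires Imagick compiled against ImageMagick 6.6.0 or newer:

```php
<?php
$img = new Imagick('input.jpg');

// Per-channel means; getImageChannelMean() returns an array with
// 'mean' and 'standardDeviation' keys.
$r = $img->getImageChannelMean(Imagick::CHANNEL_RED)['mean'];
$g = $img->getImageChannelMean(Imagick::CHANNEL_GREEN)['mean'];
$b = $img->getImageChannelMean(Imagick::CHANNEL_BLUE)['mean'];
$gray = ($r + $g + $b) / 3;

// 5x5 color matrix applied to (R, G, B, A, 1); only the diagonal
// gains are changed here, pulling each channel's mean toward gray.
$img->colorMatrixImage([
    $gray / $r, 0, 0, 0, 0,
    0, $gray / $g, 0, 0, 0,
    0, 0, $gray / $b, 0, 0,
    0, 0, 0, 1, 0,
    0, 0, 0, 0, 1,
]);

$img->writeImage('balanced.jpg');
```

Run both images through the same balancing step and their color casts should land much closer together, even if the exposures originally differed.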