I am currently applying the Contrast Limited Adaptive Histogram Equalization (CLAHE) algorithm together with a denoising algorithm to the photo.
My problem is that I am working with 360° photos. Because the contrast enhancement produces different values at the two edges, the seam line is highly noticeable once the photo is joined. How can I mitigate that line? What changes should I make so that it is not noticeable and the algorithm is applied consistently?
Original Photo:
Code for Contrast Limited Adaptive Histogram Equalization:
import cv2

# CLAHE (Contrast Limited Adaptive Histogram Equalization)
image = cv2.imread("photo360.jpg")  # path is a placeholder for the 360 photo
clahe = cv2.createCLAHE(clipLimit=1.0, tileGridSize=(6, 6))
lab = cv2.cvtColor(image, cv2.COLOR_BGR2LAB)  # convert from BGR to LAB color space
l, a, b = cv2.split(lab)  # split into the 3 separate channels
l2 = clahe.apply(l)  # apply CLAHE to the L-channel only
lab = cv2.merge((l2, a, b))  # merge the channels back
img2 = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)  # convert from LAB back to BGR
Result:
Result assembled as a 360° photo:
The line of separation is highly noticeable because the algorithm does not take into account that the photo is joined afterwards. What can I do?
Here's an answer in C++; you can probably convert it easily to Python/NumPy (a rough Python sketch is included at the end of this answer). The idea is to add a border region before performing CLAHE and to crop the image back afterwards.
These are the subimage regions in the original image:
and they will be copied to the left/right of the image like this:
You can probably reduce the size of the border considerably:
Images: input as in your original post.
After creating the border:
After performing CLAHE (with border):
After cropping the CLAHE-border-image:
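For reference, here is a minimal Python/NumPy sketch of the same idea, assuming an equirectangular 360° BGR image. It uses cv2.copyMakeBorder with BORDER_WRAP to copy the opposite edge strips to the left and right before applying CLAHE, then crops them off again. The function name and the border, clip limit and tile grid values are illustrative, not tuned:

import cv2

def clahe_with_wrap_border(image, border=64, clip_limit=1.0, grid=(6, 6)):
    # Pad the left/right sides with a wrap-around border so the CLAHE tiles
    # near the seam also see pixels from the opposite edge of the panorama.
    padded = cv2.copyMakeBorder(image, 0, 0, border, border, borderType=cv2.BORDER_WRAP)

    # Same CLAHE-on-L-channel processing as in the question, applied to the padded image.
    lab = cv2.cvtColor(padded, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=grid)
    lab = cv2.merge((clahe.apply(l), a, b))
    result = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

    # Crop the padded columns off again so the output has the original width.
    return result[:, border:-border]

In the original pipeline you would then call clahe_with_wrap_border(image) instead of applying CLAHE to the full image directly, so both sides of the seam are equalized using the same neighbourhood.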