I want to write a shader (surface/fragment/...) that recolors my diffuse texture with a new color. Currently I have this version of the shader (I'm trying to recolor the texture in real time):
//sm_surf
uniform vec4 colorTarget; // used in the full version
uniform vec4 colorTint;   // used in the full version
vec4 colorTexture = texture2D(u_diffuseTexture, _surface.diffuseTexcoord);
//vec4 colorTexture = _sample.diffuse;//the same result
vec4 tinted = colorTexture;
tinted.r = max(0.0, colorTexture.r - 0.2);
_surface.diffuse = tinted;
And this is the OpenCV code (I just recolored the texture beforehand and used it as the new diffuse texture):
image = cv::imread([path UTF8String], cv::IMREAD_UNCHANGED);
for (int i = 0; i < image.rows; i++) {
    for (int j = 0; j < image.cols; j++) {
        cv::Vec4b pixel = image.at<cv::Vec4b>(i, j);
        pixel[2] = fmin(255, fmax(0, pixel[2] - 50));
        image.at<cv::Vec4b>(i, j) = pixel;
    }
}
cv::imwrite([newPath UTF8String], image);
For this test I just want to reduce the R component of the color. Results:
OpenCV (correct)
SceneKit (incorrect)
The diffuse texture contains an alpha channel.
(SOLVED by mnuages) Also, it seems the alpha channel is broken after recoloring with the shader. With this shader:
tinted.r = 1.0;
tinted.g = 0.0;
tinted.b = 0.0;
How can I recolor the diffuse texture just like in OpenCV?
UPDATE: These are the results for the SceneKit shader and OpenCV (I have removed all transparent pixels from the image):
shader:
vec4 colorTexture = _surface.diffuse;
vec3 tinted = colorTexture.a > 0.0 ? colorTexture.rgb / colorTexture.a : colorTexture.rgb;
if (colorTexture.a == 1.0) {
    tinted.r = max(0.0, colorTexture.r - 0.2);
} else {
    colorTexture.a = 0.0;
}
_surface.diffuse = vec4(tinted, 1.0) * colorTexture.a;
and OpenCV code:
pixel[2] = fmax(0, pixel[2] - 50); // index 2 because it's BGR in OpenCV
if (pixel[3] != 255) {
    pixel[3] = 0;
}
Some more strange things: I have changed my OpenCV code to this to generate a new texture:
pixel[0] = 255 - (j % 4) * 30; // b
pixel[1] = 0;                  // g
pixel[2] = 0;                  // r
pixel[3] = 255;
If I change this texture like this:
if (pixel[0] == 255) {
    pixel[0] = 255; pixel[1] = 255; pixel[2] = 255;
} else {
    pixel[0] = 0; pixel[1] = 0; pixel[2] = 0;
}
I receive something like this:
With this SceneKit shader it should be the same:
vec4 colorTexture = _surface.diffuse;
vec3 tinted = colorTexture.rgb; // colorTexture.a == 1
if (tinted.b > 0.99) {
    tinted = vec3(0.0, 0.0, 0.0);
} else {
    tinted = vec3(1.0, 1.0, 1.0);
}
_surface.diffuse = vec4(tinted, 1.0) * colorTexture.a;
But I receive this:
There are some white stripes, but they are way too thin.
I can widen them by changing the condition to tinted.b > 0.85, but that is already wrong, because the color in _surface.diffuse is not the same as in the texture. It seems like SceneKit interpolates the texture or something like that.
UPDATE2:
I have added source code (1.5 MB) reproducing this problem. There are 3 spheres:
1) Top: with the original texture
2) Left: with the texture recolored by the shader (newR = r - 0.2) (float)
3) Right: with the texture recolored by OpenCV (newR = r - 51) (uint8)
and they are different! The scene doesn't contain any lights/environment/... just the 3 spheres.
SceneKit uses premultiplied alpha. Un-premultiplying the sampled color, applying the tint, and then re-multiplying by alpha (as in the shader from your first update) should work.
Edit
This code works well in the example you attached, but it requires some changes for the two techniques to match 100%. As explained in this other SO thread, SceneKit shaders operate in a linear color space.
This means that 51 in your CPU code doesn't map to 0.2 in your GPU code. To get the same result you will need to convert the sampled color (linear) to sRGB (non-linear), apply the tint operation, and then convert back to linear.
As for the example with stripes, this is the expected behaviour. Mipmapping will lead to grayscale values. If you move close enough to the object so that 1 texel is projected onto ~1 pixel on screen, then you'll only get black and white values again.