I want to blend a multisampled image onto another multisampled image. Both images have the same sample count and the same dimensions. I tried two approaches (the descriptor setup for both is sketched after the shaders):
- multisampled texture. Fragment shader:
#version 450
layout (set = 1, binding = 0) uniform sampler2DMS textureSampler;
layout (location = 0) out vec4 outFragColor;
void main()
{
    outFragColor = texelFetch(textureSampler, ivec2(gl_FragCoord.xy), gl_SampleID);
}
- input attachment. Fragment shader:
#version 450
layout(input_attachment_index = 0, set = 1, binding = 0) uniform subpassInputMS offscreenBuffer;
layout(location = 0) out vec4 color;
void main(void)
{
    color = subpassLoad(offscreenBuffer, gl_SampleID);
}
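For reference, the set 1, binding 0 descriptor layouts matching the two shaders are declared along these lines (a minimal sketch; only the relevant fields are shown and the variable names are illustrative):
VkDescriptorSetLayoutBinding msTextureBinding = {
    .binding = 0,
    .descriptorType = VK_DESCRIPTOR_TYPE_COMBINED_IMAGE_SAMPLER, /* sampler2DMS case */
    .descriptorCount = 1,
    .stageFlags = VK_SHADER_STAGE_FRAGMENT_BIT,
};

VkDescriptorSetLayoutBinding inputAttachmentBinding = {
    .binding = 0,
    .descriptorType = VK_DESCRIPTOR_TYPE_INPUT_ATTACHMENT, /* subpassInputMS case */
    .descriptorCount = 1,
    .stageFlags = VK_SHADER_STAGE_FRAGMENT_BIT,
};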
I enabled the sampleRateShading device feature and, in the pipeline's multisample state, set sampleShadingEnable to VK_TRUE and minSampleShading to 1.0f. The two approaches give somewhat different results, but both produce rectangular garbage. With the input attachment, the outline of the blended image matches the original, but its pixels look as if they had been replaced by blocks of garbage.
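Concretely, the multisample state of the blending pipeline looks roughly like this (sketch; the 4x sample count is just an example, and omitted fields are zero-initialized):
VkPipelineMultisampleStateCreateInfo multisampleState = {
    .sType = VK_STRUCTURE_TYPE_PIPELINE_MULTISAMPLE_STATE_CREATE_INFO,
    .rasterizationSamples = VK_SAMPLE_COUNT_4_BIT, /* same count as both images (example) */
    .sampleShadingEnable = VK_TRUE,                /* run the fragment shader per sample */
    .minSampleShading = 1.0f,                      /* shade every sample */
};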
I have tested the same render passes and pipelines with non-multisampled images (using different shaders for that case), and everything works fine there.
What am I doing wrong?