What is the expected comparison between AlphaRef and the shader's alpha value in D3D9?

Asked by wells hong

The AlphaRef value is set with D3DRS_ALPHAREF, an integer render state ranging from 0x00000000 to 0x000000FF. However, the alpha value computed by the shader is a float in the range [0.0, 1.0]. So what comparison is expected between the two values, given that they use different precisions? Does the float alpha value need an extra rounding step?
In Direct3D 9, all shader input data is converted to floating point, and shader output is converted back to the destination format as needed on output.