I'm currently working on a project where I'm coloring 3D objects. The code is similar to here. Basically, I use an orthographic camera that renders to a RenderTexture, which is used as the UV texture in my material. In the beginning, it only "sees" a plane with a base image on it.
When painting, I create a 2D sprite at the correct position in front of this render camera. This works fine.
Now, since Unity does not run smoothly with a lot of sprites, I reduce the load by combining the sprites into the base image with Texture2D.ReadPixels(). This new texture is then set as the new base texture. Rinse and repeat.
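For context, the paint step itself looks roughly like this. It's only a sketch: paintCamera, brushSpritePrefab and the UV-to-world mapping are simplified placeholders, not the exact code from my project.

using UnityEngine;

public class UVPainter : MonoBehaviour
{
    // Orthographic camera that renders the base-image plane (plus brush sprites) into a RenderTexture
    public Camera paintCamera;
    // 2D sprite prefab used as the brush stamp
    public GameObject brushSpritePrefab;

    // Spawn a brush sprite in front of the paint camera at the hit UV coordinate
    public void Paint(Vector2 uv)
    {
        // Map the UV coordinate (0..1) into the orthographic camera's view
        Vector3 worldPos = paintCamera.ViewportToWorldPoint(new Vector3(uv.x, uv.y, 1f));
        Instantiate(brushSpritePrefab, worldPos, Quaternion.identity);
    }
}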
This approach works flawlessly if I'm using the built-in Unity camera settings. As the project is intended for VR, I ported everything into a SteamVR project. The runtime rendering is fine as well; only the save-as-PNG texture function starts to break. I included links to the different images: the first is the base image, the second the texture saved with "Unity only", and the last the one saved with SteamVR. I only paint green circles as sprites onto them.
As you can see, the SteamVR version has several problems compared with the Unity-only version:
- The alpha channel comes out black; there is no fading
- The overall resolution becomes... unstable? (see the numbers; they get blockier)
- The colors in blocks 1, 5 and 6 are wrong
I tried different formats for the RenderTexture and the UV texture (RGBA, ARGB, RGB). Apart from importing SteamVR, nothing else has changed. What am I missing?
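For reference, the RenderTexture is set up roughly like this (again just a sketch; the names are placeholders, ARGB32 is one of the formats I swapped in and out while testing, and paintedObjectRenderer stands for the renderer of the 3D object being colored):

// Sketch of the RenderTexture setup; ARGB32 was exchanged for the other formats while testing
RenderTexture paintRT = new RenderTexture(width, height, 0, RenderTextureFormat.ARGB32);
paintRT.Create();
paintCamera.targetTexture = paintRT;

// The 3D object's material samples this RenderTexture as its UV image
paintedObjectRenderer.material.mainTexture = paintRT;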
The rendering code is as follows:
// Copy the currently active render target into a new Texture2D
Texture2D tex = new Texture2D(width, height, TextureFormat.RGB24, false);
tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
tex.Apply();
// Use the combined result as the new base texture on the plane
plane.GetComponent<MeshRenderer>().material.mainTexture = tex;
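For completeness, ReadPixels copies from whatever RenderTexture is currently active, so spelled out (and assuming paintRT is the camera's target RenderTexture from the sketch above), the capture step is roughly:

// Sketch: make the paint camera's RenderTexture the active render target before reading
RenderTexture previous = RenderTexture.active;
RenderTexture.active = paintRT;

Texture2D tex = new Texture2D(width, height, TextureFormat.RGB24, false);
tex.ReadPixels(new Rect(0, 0, width, height), 0, 0); // reads from RenderTexture.active
tex.Apply();

RenderTexture.active = previous; // restore the previous render target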
Saving is done with:
// Encode the texture to PNG and write it to disk
var bytes = tex.EncodeToPNG();
System.IO.File.WriteAllBytes(fullPath + fileName, bytes);
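(As an aside, and not part of the problem: a slightly more defensive version of the write, building the path with Path.Combine and making sure the target folder exists, would be:

// Sketch: build the path explicitly and ensure the folder exists before writing
string path = System.IO.Path.Combine(fullPath, fileName);
System.IO.Directory.CreateDirectory(fullPath);
System.IO.File.WriteAllBytes(path, tex.EncodeToPNG());
)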
If necessary, I can upload the complete project. Thanks for your help.
Thomas