I have a partial mesh of a room that I am trying to texture with an image taken from a particular camera view of it. I am using pyrender to do the rendering. If I load a mesh from an .obj file that is already textured, the textures render properly. However, when I take an untextured mesh and try to add a single-image texture, it renders in a solid color.
Here's the code I have for taking the mesh vertices / faces and building the textured pyrender mesh object:
# Project Vertices into the Image to get UV coordinates
v_s, u_s = camera_projection(vertices, camera_pose, camera_K)
uvs = np.concatenate([u_s[valid_indices, None], v_s[valid_indices, None]], axis=1).astype(int)
# Build the Trimesh texture and Mesh
# image is a (1080, 1080, 3) np.uint8 matrix
material = trimesh.visual.texture.SimpleMaterial(image=image)
color_visuals = trimesh.visual.TextureVisuals(uv=uvs, image=image, material=material)
assert len(uvs) == len(vertices)
mesh = trimesh.Trimesh(vertices=vertices, faces=faces, visual=color_visuals, validate=True, process=True)
# Convert the trimesh mesh to a pyrender mesh
mesh = pyrender.Mesh.from_trimesh(mesh, smooth=True)
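For context, `camera_projection` is my own helper and is essentially a standard pinhole projection. A rough sketch of what it does, assuming `camera_pose` is the 4x4 camera-to-world transform and `camera_K` the 3x3 intrinsics (the exact conventions in my real code may differ):

```python
import numpy as np

def camera_projection(vertices, camera_pose, camera_K):
    # Transform world-space vertices into the camera frame.
    # camera_pose is assumed to be camera-to-world, so invert it.
    world_to_cam = np.linalg.inv(camera_pose)
    v_h = np.concatenate([vertices, np.ones((len(vertices), 1))], axis=1)
    cam_pts = (world_to_cam @ v_h.T).T[:, :3]
    # Project with the pinhole intrinsics and divide by depth.
    proj = (camera_K @ cam_pts.T).T
    u = proj[:, 0] / proj[:, 2]  # column (x) pixel coordinate
    v = proj[:, 1] / proj[:, 2]  # row (y) pixel coordinate
    return v, u
```

So `u_s` and `v_s` end up as pixel coordinates into the 1080x1080 image.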
The rest of the rendering pipeline works properly if given a mesh that is textured in the .obj file. I know for a fact that all of the vertices have UV coordinates within the image boundaries, but not all of the image is used (since the mesh has some gaps). Any ideas on what might be going wrong here, or any suggestions for how to go about texturing the mesh using another library?
Thanks!