I am using the PX4 toolchain to build an autonomous UAV. I am using it through the Gazebo simulator, using ROS and MAVROS. For now, I need to work on path finding algorithms, not sensors, so I would like a "ground truth" map of a Gazebo simulation. If possible it would be great to be able to use it through the Octomap node of ROS.
All I have been able to do is connect a depth camera point cloud to the ROS octomap node and view the result in RViz, as explained in this tutorial. I considered placing depth camera sensors all around my world to approximate the ground truth with an octomap representation, but that looks complicated to set up and not very efficient.
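For reference, the setup I currently have can be sketched as a launch file like the one below. The topic name `/camera/depth/points` is an assumption and depends on your depth camera plugin; the `octomap_server` node and its `cloud_in` topic are standard.

```xml
<launch>
  <!-- octomap_server incrementally builds an octomap from an incoming point cloud -->
  <node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
    <param name="resolution" value="0.1" />
    <param name="frame_id" value="map" />
    <!-- Remap to your depth camera's point cloud topic (assumed name) -->
    <remap from="cloud_in" to="/camera/depth/points" />
  </node>
</launch>
```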
So, is there a way to generate a point cloud representation of the current Gazebo world and feed it to an Octomap?
Thank you very much for your help! :)
I am posting my solution since it might help someone else.
In the end, I wrote a script to convert a Gazebo world into an octomap .bt file. It mainly follows this reference. First, I merge the .world, .sdf and .dae files into a single .dae file; this is done in a Python script that parses the XML of the .world and .sdf files and uses the pycollada library to parse the .dae meshes. Then I use Blender to convert the .dae scene into a .obj. Finally, using the program from the link, I convert the .obj into a voxel representation (.binvox), and from there into an octomap .bt file.
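To give an idea of the first step, here is a minimal sketch of how a .world file can be parsed to collect each model's pose and mesh URI before merging everything into one .dae. The element names (`model`, `pose`, `mesh`, `uri`) come from the SDF format; the example world and file paths below are made up for illustration, and my actual script handles more cases than this.

```python
# Sketch: extract model poses and mesh URIs from a Gazebo .world (SDF XML).
import xml.etree.ElementTree as ET

def extract_models(world_xml):
    """Return a list of (model_name, pose, mesh_uri) tuples from a .world string."""
    root = ET.fromstring(world_xml)
    models = []
    for model in root.iter('model'):
        name = model.get('name')
        # <pose> is "x y z roll pitch yaw"; default to the origin if absent
        pose_el = model.find('pose')
        pose = ([float(v) for v in pose_el.text.split()]
                if pose_el is not None else [0.0] * 6)
        # A model may reference several meshes (one per visual)
        for mesh in model.iter('mesh'):
            uri = mesh.find('uri')
            if uri is not None:
                models.append((name, pose, uri.text))
    return models

# Minimal example world for demonstration (hypothetical wall model)
WORLD = """
<sdf version="1.6">
  <world name="default">
    <model name="wall">
      <pose>1 2 0 0 0 0</pose>
      <link name="link">
        <visual name="visual">
          <geometry>
            <mesh><uri>model://wall/meshes/wall.dae</uri></mesh>
          </geometry>
        </visual>
      </link>
    </model>
  </world>
</sdf>
"""

print(extract_models(WORLD))
```

With the poses and mesh URIs collected this way, each mesh can then be loaded with pycollada, transformed by its pose, and appended to the merged scene.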
My code: https://github.com/rhidra/autopilot/tree/master/world_to_octomap
My code, especially the Python script that parses the .world and .sdf files, is quite rudimentary, so it may be too simple for a complex world.
I hope it will help someone!