Please provide the following information:
Real hardware or simulation: real robot
Expected behaviour
Diagnostics take 1-2% CPU at steady state, not 22%
Actual behaviour
Diagnostics sit at roughly 22% CPU at steady state. Note that this is almost double the CPU of a 3D lidar driver publishing 3D data at 10 Hz, and half as much as the RealSense doing 30 fps on huge point clouds.
This seems excessive for what it does. I suspect it could be massively reduced if implemented in C++.
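For reproducibility, here is a minimal, Linux-only sketch of how a steady-state CPU percentage like the one above could be sampled for a running node. This is not from the original report: the `/proc`-based approach, the function names, and the sampling window are all illustrative assumptions; any process monitor (`top`, `pidstat`, `psutil`) would do equally well.

```python
# Hypothetical helper for sampling a process's steady-state CPU share.
# Linux-only: reads utime/stime from /proc/<pid>/stat (see proc(5)).
import os
import time

CLK_TCK = os.sysconf("SC_CLK_TCK")  # kernel clock ticks per second


def cpu_seconds(pid: int) -> float:
    """Total user+system CPU time consumed by a process, in seconds."""
    with open(f"/proc/{pid}/stat") as f:
        # Strip "pid (comm) " first, since comm may contain spaces.
        fields = f.read().rsplit(") ", 1)[1].split()
    # After the comm field, indices 11 and 12 are utime and stime
    # (fields 14 and 15 in the proc(5) numbering).
    utime, stime = int(fields[11]), int(fields[12])
    return (utime + stime) / CLK_TCK


def cpu_percent(pid: int, window: float = 5.0) -> float:
    """Average CPU percentage of a process over a sampling window."""
    start = cpu_seconds(pid)
    time.sleep(window)
    return 100.0 * (cpu_seconds(pid) - start) / window
```

Sampling the diagnostics node's PID with a window of 30-60 s gives a stable steady-state figure to compare against the lidar and RealSense drivers.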