roahmlab / sel_map

Semantic ELevation (SEL) map is a semantic Bayesian inferencing framework for real-time elevation mapping and terrain property estimation from RGB-D images.
https://roahmlab.github.io/sel_map/
MIT License

Mesh display: no visual available, can't draw mesh! (maybe no data has been received yet?) #4

Closed sevenzero70 closed 1 year ago

sevenzero70 commented 1 year ago

Hello, it's an honor to learn about your work. When I run this project on my device, something goes wrong with `roslaunch sel_map spot_sel.launch`:

[ INFO] [1667551188.586010063]: Mesh Display: Update
[ERROR] [1667551188.586152061]: Mesh display: no visual available, can't draw mesh! (maybe no data has been received yet?)
[MeshDisplay] onEnable-------------------------------->
[ INFO] [1667551188.586368137]: Mesh Display: Update
[ERROR] [1667551188.586400533]: Mesh display: no visual available, can't draw mesh! (maybe no data has been received yet?)
[ INFO] [1667551188.589839548]: No initial data available, waiting for callback to trigger ...
[ INFO] [1667551188.589980674]: Mesh Display: Update
[ERROR] [1667551188.590021492]: Mesh display: no visual available, can't draw mesh! (maybe no data has been received yet?)
[ INFO] [1667551188.590136463]: Mesh Display: Update
[ERROR] [1667551188.590174442]: Mesh display: no visual available, can't draw mesh! (maybe no data has been received yet?)
[ INFO] [1667551188.592721792]: Mesh Display: Update
[ERROR] [1667551188.592792062]: Mesh display: no visual available, can't draw mesh! (maybe no data has been received yet?)

After that, I looked into it, and I think the problem happens in `void MeshDisplay::initialServiceCall()`, because of this code:

ros::NodeHandle n;
m_uuidClient = n.serviceClient<mesh_msgs::GetUUIDs>("get_uuid");

mesh_msgs::GetUUIDs srv_uuids;
if (m_uuidClient.call(srv_uuids))
{
  std::vector<std::string> uuids = srv_uuids.response.uuids;
  if (uuids.size() > 0)

Here `m_uuidClient.call(srv_uuids)` returns false, which is why `[ INFO] [1667551188.589839548]: No initial data available, waiting for callback to trigger ...` is printed on screen. But I can't find the reason; maybe it's related to the `mesh_tools` ROS package?
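One way to narrow this down is to check from the command line whether the `get_uuid` service is actually being advertised by any node. This is a diagnostic sketch, assuming a running ROS master; the service may be namespaced differently in your setup, so the exact path printed by `rosservice list` is what you should call:

```shell
# List all advertised services and look for the UUID service
rosservice list | grep -i uuid

# If it appears, query it directly with an empty request.
# An empty uuids list in the response means no mesh has been published yet.
rosservice call /get_uuid "{}"
```

If the service does not appear at all, the issue is on the publisher side (the sel_map node or mesh_tools), not in the rviz plugin.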

I really don't know how to proceed from here. Could you give me some advice? Thanks a lot.

BuildingAtom commented 1 year ago

Hi sevenzero70, thanks for checking out our work!

Have you tried running any data through ROS? The initial mesh that is created for the map is treated as an empty mesh, so nothing will be sent to nor will show up inside of rviz until the map has data to process and the mesh begins to have valid points.

If you'd like to try with one of the rosbags we used for our paper, issue #3 includes the following link to spot_comp3.bag which corresponds to figure 5b in our paper along with some extra info. As liuxiao916 mentioned in that issue, you'll want to launch with spot_bag_sel.launch instead. We'll be updating our readme with this additional information eventually.

In case you are running data and nothing is showing, do you mind adding open_rviz:=false as another roslaunch argument and then opening rviz separately? Then could you send the output of the roslaunch window? In other words, in one terminal could you launch roslaunch sel_map spot_sel.launch open_rviz:=false and in another launch rviz, then share the output of the first terminal?
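Concretely, the two-terminal workflow described above would look something like this (a sketch; any additional launch arguments you normally pass should be kept):

```shell
# Terminal 1: launch sel_map without rviz and keep this window's output
roslaunch sel_map spot_sel.launch open_rviz:=false

# Terminal 2: open rviz separately
rviz
```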

Also, I saw your request for access to one of the segmentation network colordict sheets we used for internal reference and have opened it up for general access. You should be able to see it now!

sevenzero70 commented 1 year ago

Thanks for your reply! I found the problem: my device performs so poorly that reading the bag messages takes a particularly long time, and after setting the playback rate to 0.1 I can just barely build a complete elevation map. I have some other questions:

1. If I want to process images from a RealSense camera, which topics should I provide to the sel_map node?
2. When the bag finishes, the robot is left suspended in mid-air; is this due to missing odometry information?

Sorry to take up so much of your time. Looking forward to your answer, thanks!


BuildingAtom commented 1 year ago

Ah yes, our bags were recorded with the raw image data, so they do require a storage device with high bandwidth in order to keep processing at rate.

For any new device, you can specify a new robot configuration in sel_map/sel_map/config/robots/. In your new file, the image_rectified and depth_rectified tags refer to the aligned color and depth images to use, and the camera_info tag should be the topic that is relevant to both. You can then make your own custom launch file in sel_map/sel_map/launch/ to handle launch parameters and any supporting nodes you might want or need depending on the robot (e.g. image_transport's republish node or image_proc's depth registration node).
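As an illustration only: the three tag names below come straight from the description above, but the file layout and topic values are assumptions (the topics shown are common RealSense driver defaults), so check the existing files in sel_map/sel_map/config/robots/ for the exact schema before copying this:

```yaml
# hypothetical realsense.yaml -- structure is a guess; adapt to the
# schema used by the existing robot configs in this directory
image_rectified: /camera/color/image_raw
depth_rectified: /camera/aligned_depth_to_color/image_raw
camera_info: /camera/aligned_depth_to_color/camera_info
```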

Hopefully this helps!

BuildingAtom commented 1 year ago

Due to inactivity, I'll be marking this issue as closed.

sevenzero70 commented 1 year ago

Sorry to bother you after such a long time. I got a partial mapping result when running spot_comp8.bag under the default configuration, as shown in the figure, but I can't match the colors in the figure to anything: I don't know what each color means, or whether it represents the terrain category or a property value. Also, I'd like to produce an image similar to the second row of Figure 5 in your paper, showing the coefficient of friction; what should I do? Waiting for your reply, thanks.


BuildingAtom commented 1 year ago

Hi @sevenzero70, sorry for the delay. I would use spot_comp3.bag as that corresponds to a figure in our paper. spot_comp8.bag works, but it doesn't directly correspond to any figures.

When running with the default configuration, it'll use the colorscale defined in sel_map/sel_map/config/colorscales/default.yaml. With this default colorscale, unknown values are mapped to gray while values between 0 and 1 are mapped according to the 7-stop pink-purple colorscale. Lower values are pinker, while higher values are more purple. The measured terrain values can be found in sel_map/sel_map/config/terrain_properties/csail_semseg_properties.yaml. If you want to use the colors defined by the terrain_properties files, you can add colorscale:=use_properties.yaml to the launch command.

If you want an image similar to Figure 5 and your hardware may be running slower, you may want to consider replaying the rosbag with the extra --rate option: rosbag play spot_comp3.bag --clock --pause --rate 0.5. Since the node is designed to run in real time, it will skip ahead in the processing of frames if it falls too far behind, and because the method relies on projected areas being processed multiple times as part of the Bayesian estimation, skipped frames probably won't yield reasonable results.
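Putting the pieces together, a slowed-down replay might look like the following. This is a sketch: it assumes the bag-oriented launch file handles simulated time when the bag is played with --clock, which is an assumption on my part.

```shell
# Terminal 1: launch the bag-oriented pipeline
roslaunch sel_map spot_bag_sel.launch

# Terminal 2: replay at half speed with the simulated clock;
# playback starts paused, press space to begin once everything is up
rosbag play spot_comp3.bag --clock --pause --rate 0.5
```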

We pause our rosbag at around the 50 second timestamp to get the mesh shown in figure 5b.