elpimous opened this issue 4 years ago
I have recently been investigating a similar problem, here is what I know so far: The database is internal to the T265, though it is possible to export it and later import it. However, all examples I have found interface directly with librealsense; I am not aware of a way to do that using the ROS wrapper.
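For reference, here is a rough sketch of what that export/import round trip can look like through librealsense's Python bindings. The `first_pose_sensor`, `export_localization_map`, and `import_localization_map` calls are the real librealsense API as I understand it, but the helper names and file handling around them are my own scaffolding, and I have not verified this end to end on hardware:

```python
# Sketch: saving/loading the T265's internal localization map via pyrealsense2.
# export_localization_map / import_localization_map are librealsense calls;
# the helpers and file layout are hypothetical scaffolding.

def save_map_bytes(data, path):
    """Write the raw map (a sequence of byte values) to disk."""
    with open(path, "wb") as f:
        f.write(bytes(bytearray(data)))

def load_map_bytes(path):
    """Read a raw map back as a list of byte values."""
    with open(path, "rb") as f:
        return list(f.read())

def export_t265_map(path):
    """Export the device's current map; requires a connected T265."""
    import pyrealsense2 as rs  # imported here so the pure helpers stay usable without it
    dev = rs.context().query_devices()[0]
    sensor = dev.first_pose_sensor()
    save_map_bytes(sensor.export_localization_map(), path)

def import_t265_map(path):
    """Load a previously exported map onto the device (before streaming starts)."""
    import pyrealsense2 as rs
    dev = rs.context().query_devices()[0]
    sensor = dev.first_pose_sensor()
    sensor.import_localization_map(load_map_bytes(path))
```

The same calls exist in the C++ API (`rs2::pose_sensor`); as far as I know the map can only be imported before streaming starts, so a ROS wrapper would need equivalent hooks at startup and shutdown.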
Thanks. So it must work like the Oculus Quest (keeping all guardians (perimeters) in memory).
Could Intel staff tell us how to access this database?
I have not tried it myself, but here is a code sample showing how to export and import the map.
Thanks, I'll investigate it.
Well, we can save and load the raw map in realsense-viewer (and from the command line, I think).
Is there any way to load it under ROS?
Hello @elpimous
Any progress? I, too, am using the T265 and D435, but on the Jetson Nano. I implemented the occupancy-mapping branch of realsense-ros, but it looks like they simply use the T265 as an IMU for odometry readings and construct the 2D occupancy map using laser scan data from the D435 (please correct me if I'm wrong). It doesn't use the online SLAM capabilities of the T265.
I was under the impression that the T265's SLAM capabilities could be directly visualized in ROS, since the T265 has been marketed as an off-the-shelf SLAM device with ROS support. @RealSenseCustomerSupport would you be able to provide any support regarding this?
Hi @arjunsengupta98, the D435 is used to create the 2D map via a fake laser scan. After that, the D435 is only used to see obstacles, avoid collisions, and help with placement, again via the laser scan.
The T265 fuses its IMU with its stereo fisheye cameras to find landmarks in the images.
When the 2D occupancy map is done with the T265 (and only it), the database stays in its memory, but it can be accessed from the command line.
So, bottom line, the T265 is "only an IMU", lol, but an IMU that can re-place the robot at its real location, via its internal database. I must say it's very good for positioning.
The D435 does all the other parts.
Thanks for the response @elpimous. Any idea how that internal map can be visualized/streamed?
If it is possible through the command line, could you provide any links/references demonstrating how this is done?
Any update on this? How can we relocalize a robot after, for example, a power loss?
According to Intel's claims:
A good system relocalizes with centimeter absolute accuracy and has the option to relocalize frequently and on a predefined cadence. Another very important aspect of feature map generation, is whether the maps can be shared across multiple agents. For example, it would be desirable to have that any robot or autonomous vehicle could benefit from the fact that another robot has already mapped out that area. For this to happen, each robot must be able to not only capture, but also export and share their maps, and more importantly, the detected features should look the same on each robot. The T265 allows for this cooperative mapping.
But I haven't found any example/demonstration of it. Could you kindly provide some insight?
I have implemented map exporting and importing, with successful relocalization, in a local branch of the ROS realsense code. I will try to make it into a public fork soon.
Hi @drjsmith, I was just curious what level of localization you were able to achieve. I tried the librealsense C++ interface to save and retrieve the map along with landmarks. What I got was the position of my robot relative to the landmark instead of a global position wrt the map. Were you able to get a global, map-centric pose after relocalization?
I created a fork with my changes here.
If the `map_out` parameter is specified, the onboard map will be exported to that location when the node exits. If the `map_in` parameter is specified, an existing raw map will be loaded from that location onto the device.
The `map_frame_id` parameter determines the name of the global mapping frame. This frame is represented by a landmark added to the onboard map during the first mapping run (i.e. when no prior map was loaded). Querying the camera for the relative pose of the landmark provides the necessary information to publish the transform from the map frame to the realsense odometry frame.
You can then use tf/tf2 as usual to transform a pose from any frame to the map frame.
I hope you find this code useful; feel free to open an issue if you find any problems or have feedback. I'm not certain how much time I will be devoting to this in the future, but I'll see what I can do.
Use `roslaunch realsense2_camera rs_t265.launch` to start the node, specifying the args `map_out`, `map_in`, and `map_frame_id` as relevant.
For example, the first time you might run `roslaunch realsense2_camera rs_t265.launch map_out:=/tmp/map1`, and the next time `roslaunch realsense2_camera rs_t265.launch map_in:=/tmp/map1 map_out:=/tmp/map2`, etc.
@drjsmith Thanks a lot for your package. I am unable to understand the relative transformations between the static nodes and the camera after localization. Can you please specify the spatial relationship between the transforms? After successful relocalization, is the camera pose provided relative to that node?
@zainmehdi The realsense camera starts a new SLAM session every time and reports its pose wrt where it started (more or less). Published poses are in this realsense odometry frame. Relocalization does not change the frame poses are published in, but gives the necessary information to calculate a transform to the frame `map_frame_id`. The camera pose is not provided directly in the frame `map_frame_id`, but any node that needs to know the pose in `map_frame_id` can get it using tf.
Hi @drjsmith, if the robot is kidnapped, how could it retrieve its location? How could it query and find the landmark if the landmark isn't in its field of view?
@germal It is a virtual landmark, also referred to as a static node in the realsense api, and doesn't actually affect the camera's internal process of tracking and localizing. A static node is added with the `set_static_node` function and is simply a named pose, where the pose is provided relative to the origin of the current coordinate system of device poses. The `get_static_node` function gets the pose of the static node with the given name relative to the current origin of coordinates of device poses. Thus, poses of static nodes of an imported map are consistent with current device poses after relocalization.
So long as the camera is able to localize wrt the imported map, the transform between the tracking frame of the current session and that of the original session can be found by querying the camera for the pose of the static node that was placed at the origin of the tracking frame during that original session.
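To make that last step concrete: the pose returned for the static node is the original session's origin expressed in the current odometry frame, so it is (up to tf's parent/child conventions, which I'm glossing over) the transform between the two frames, and inverting it gives the other direction. Here is a small pure-Python sketch of that pose algebra; the numeric values are made up, and on hardware the pose would come from `get_static_node`:

```python
# A pose here is (t, q): translation [x, y, z] plus unit quaternion [w, x, y, z].
# If get_static_node returned such a pose for the original map origin expressed
# in the current odometry frame, pose_invert gives the opposite direction and
# pose_compose chains transforms, which is all the relocalization bookkeeping needs.

def q_mul(a, b):
    """Hamilton product of two quaternions in [w, x, y, z] order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return [aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw]

def q_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q (computes q * (0,v) * q^-1)."""
    _, x, y, z = q_mul(q_mul(q, [0.0] + list(v)), [q[0], -q[1], -q[2], -q[3]])
    return [x, y, z]

def pose_invert(pose):
    """Invert a rigid transform, e.g. turn T(odom<-map) into T(map<-odom)."""
    t, q = pose
    q_inv = [q[0], -q[1], -q[2], -q[3]]
    return ([-c for c in q_rotate(q_inv, t)], q_inv)

def pose_compose(a, b):
    """Chain two transforms: apply b first, then a."""
    (ta, qa), (tb, qb) = a, b
    t = [ta[i] + c for i, c in enumerate(q_rotate(qa, tb))]
    return (t, q_mul(qa, qb))

# Made-up static-node result: original map origin rotated 90 deg about y
# and offset in the current odometry frame.
map_in_odom = ([1.0, 2.0, 0.5], [0.7071068, 0.0, 0.7071068, 0.0])
odom_in_map = pose_invert(map_in_odom)
```

In a real node this bookkeeping would simply be delegated to tf2; the sketch is only to make the frame algebra explicit.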
@drjsmith Thank you for your clarifications . I understand that you always refer to "map" as the internal map of the t265, so the static node you set is like an anchor for next relicalizations using exclusively the t265. But in case of the t265+d435 occupancy mapping context, inside the resulting occupancy map, do we lose the information of the static node or there is a way to use it ?
@germal I'm not entirely sure what you mean by "lose the information of the static node". I have not used a t265 concurrently with a d435 for that purpose, but here's how I would envision it working:
Here are my assumptions:

- The occupancy mapping node transforms incoming sensor data using `tf`; `Costmap2dROS` and `octomap_ros` are capable of doing this.
- The occupancy map is built in the `map` frame.
- The t265 node is launched with `map_frame_id` set to `map` as well.

Operation would be as follows:

- Data from the d435 is transformed into the `map` frame. The transform from the d435 to the t265 is static and therefore always available, so all that is needed is the transform from the t265 to `map`. If the transform is available, the data is added to the occupancy map.
- The t265 node publishes the transform from its `base_frame` to its `odom_frame`.
- After relocalization, the t265 node publishes the transform from its `odom_frame` to the `map` frame.

An occupancy grid created during the first run will therefore be usable during subsequent runs so long as relocalization is achieved by the t265. If desired, the t265's updated internal map can be exported after each run. Similarly, the occupancy grid could be kept static after the first run or updated and exported each time.
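As a hedged sketch, that wiring might look something like the launch file below. The `map_in`/`map_out`/`map_frame_id` args match the fork's parameters described above, but the occupancy-mapping side is deliberately left as a placeholder since I haven't built it:

```xml
<launch>
  <!-- T265: load last session's map, export the updated one on node exit.
       map_frame_id=map makes the relocalization transform target the same
       frame the occupancy map is built in. -->
  <include file="$(find realsense2_camera)/launch/rs_t265.launch">
    <arg name="map_in"       value="/tmp/map_prev"/>
    <arg name="map_out"      value="/tmp/map_next"/>
    <arg name="map_frame_id" value="map"/>
  </include>

  <!-- D435 producing the fake laser scan (stock launch file, defaults kept). -->
  <include file="$(find realsense2_camera)/launch/rs_camera.launch"/>

  <!-- Hypothetical occupancy-mapping node would go here, configured to build
       in the 'map' frame; it only integrates scans once tf can resolve the
       scan frame to 'map', i.e. after the t265 relocalizes. -->
</launch>
```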
Hope this helps. I intend to do something like this soon, though I'm not sure exactly when.
@drjsmith thank you, it helps a lot! With "lose the information" I was asking whether the static node pose was also usable in the context of occupancy mapping. From your detailed reply, I now see that by using the tf transform between the occupancy map and the t265 odometry we can always use that information.
Hey @drjsmith Thanks for the code! Does it work with the latest version of librealsense (2.37.0)? I have the latest version, and I implemented your code on the Jetson Nano.
For some reason, each time the reference frame gets initialized at (0, 0, 0), the realsense manager seems to shut down.
I simply run the following:
`roslaunch realsense2_camera rs_t265.launch`
After the realsense node is initialized, the terminal outputs the messages:
```
[ WARN] [1599564253.080993695]: Warning, unable to set static node for guid [map]
[ WARN] [1599564254.080783694]: Warning, unable to set static node for guid [map]
[ WARN] [1599564255.080590259]: Warning, unable to set static node for guid [map]
[ WARN] [1599564256.080600527]: Warning, unable to set static node for guid [map]
```
I move the camera around a bit so that the tracker_confidence parameter reaches '3'. Once this happens, the terminal shows the following:
```
[ INFO] [1599564257.080754966]: Reference frame initialized: 0, 0, 0
[camera/realsense2_camera_manager-2] process has died [pid 22018, exit code -11, cmd /opt/ros/melodic/lib/nodelet/nodelet manager __name:=realsense2_camera_manager __log:=/home/jetbot/.ros/log/cd82c668-f1c5-11ea-acd1-d037451c04c2/camera-realsense2_camera_manager-2.log].
log file: /home/jetbot/.ros/log/cd82c668-f1c5-11ea-acd1-d037451c04c2/camera-realsense2_camera_manager-2.log
[camera/realsense2_camera-3] process has finished cleanly
log file: /home/jetbot/.ros/log/cd82c668-f1c5-11ea-acd1-d037451c04c2/camera-realsense2_camera-3.log
```
Any idea what the issue may be?
@arjunsengupta98 I don't know; I have only tested the code with apt version `2.7.7-0~realsense0.29`, and then only on a Core i7 equipped laptop. It sounds like it is able to add the static node just fine but potentially failing when trying to retrieve it. The best way to troubleshoot is generally to compile in debug mode and run with gdb to get a stack trace when it crashes. Does the original, unmodified t265 code run without issues? You could try temporarily disabling the localizing behavior by commenting out line 265 or 229. Without more information, it's hard to say what's going on.
Thanks a lot for the suggestion, @drjsmith, and sorry for the late response. The unmodified T265 code runs without any issues, and the code does not crash when I comment out the lines you mentioned. So yes, it looks like it is failing while trying to retrieve the static node. The log file merely shows an error message stating: "Bond broken, exiting".
Still not sure what the issue is.
@drjsmith did you create a PR to merge your branch into Intel's realsense-ros repository? I feel like this is the kind of thing that could benefit a big community, especially given the fact that Intel does not plan to do any more development on the T265 =)
@drjsmith I created a branch/PR based on your implementation: https://github.com/IntelRealSense/realsense-ros/pull/1981
@marcelino-pensa I'm glad my implementation was helpful. I'll see if I can create a followup PR to merge in the code that publishes the transform for the original map's coordinate frame.
@drjsmith Thank you! I would hold off a while on a follow-up PR, as I'm not sure whether Intel still has any interest in merging PRs related to the T265... We can plan moving forward in case the above PR gets merged.
@RealSenseCustomerSupport @RealSenseSupport Is there any chance that the PR will be merged, or will it linger due to the T265 not being supported?
Hello all. I'm working on melodic, on a Jetson TX2, with a D435 and a T265.
I previously worked with rtabmap, with point clouds, loop closures, and kidnapped-robot recovery on the D435.
Now I'm working on VIO with the D435 and T265, on Cartographer-ros.
The T265 is sold as supporting kidnapped-robot recovery.
A kidnapped-robot process means that a robot rebooted in another place must find its position and re-place itself on the map (rtabmap uses a database for that).
Regarding the Intel T265 camera, where is the T265 database?
Could anyone inform me?
Thanks all, best regards.
Vincent FOUCAULT (elpimous-robot)