Intermodalics / tango_ros

ROS related software for Tango
Apache License 2.0
67 stars 22 forks

ARCore support now that Tango is dead? #378

Open eric-schleicher opened 6 years ago

eric-schleicher commented 6 years ago

For my purposes, one of the most useful aspects of the Tango device is using it to integrate a device pose into ROS. Curiously, I don't (currently) use the Tango depth sensor, as I'm using a much more powerful RGB-D camera.

With the death of Tango and the resulting advent of ARCore, is there any thought about where to take this project / adding support for ARCore? On the surface, it looks to radically open up a lot of opportunities for people to write ROS-powered consumer applications.

Somewhat selfishly, it would be awesome to use son-of-tango-streamer on ARCore-supporting phones as a device-pose appliance for my ROS-based projects. I'm willing to go do that myself, but this seemed like the right place to surface the question.

Thoughts?

smits commented 6 years ago

It's definitely possible, but it will require some work, as TangoRosStreamer is currently based on Tango's C API, which is not yet available for ARCore. Let me get in contact with the Google Tango team to assess the possibility of a C API release. If that doesn't happen, we would basically have to rewrite TangoRosStreamer on top of ARCore's Java API.

eric-schleicher commented 6 years ago

Thx, I'll stay tuned.

jimwhite commented 6 years ago

End of Tango announced today with ARCore being the new new thing (https://blog.google/products/google-vr/arcore-developer-preview-2/). Any update on prospects for a ROS Streamer for ARCore?

eric-schleicher commented 6 years ago

At this point it would make sense to just rewrite a really trimmed-down version in Java. :(

daranguiz commented 6 years ago

There was a C API introduced for ARCore in their v2 release (https://developers.google.com/ar/reference/c/). Any thoughts there?

eric-schleicher commented 6 years ago

This is still very compelling (streaming device poses from an Android device into ROS). I just lack the C chops to implement it (sad face).

daranguiz commented 6 years ago

Quick update: I just wrote my own ROS node in Java. It's not great (only 5 Hz or so), but it gets the job done for our data. I can't share the code, but I'm using Float64MultiArray to handle the pose and Image to handle the camera data via rosjava.
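Since the actual node can't be shared, here is a rough, dependency-free sketch of what packing a pose into a Float64MultiArray payload might look like. The 7-element [x, y, z, qx, qy, qz, qw] layout and the helper names are assumptions, not daranguiz's code; the rosjava publisher call is shown only as a comment.

```java
import java.util.Arrays;

// Hypothetical sketch: flatten an ARCore camera pose into the double[]
// payload of a std_msgs/Float64MultiArray. The [x, y, z, qx, qy, qz, qw]
// layout is an assumption about the message convention, not actual code
// from this thread.
public class PosePacking {
    static double[] packPose(double[] translation, double[] quaternion) {
        double[] data = new double[7];
        System.arraycopy(translation, 0, data, 0, 3); // position
        System.arraycopy(quaternion, 0, data, 3, 4);  // orientation (x,y,z,w)
        return data;
        // With rosjava the array would then go out via something like:
        //   std_msgs.Float64MultiArray msg = publisher.newMessage();
        //   msg.setData(data);
        //   publisher.publish(msg);
    }

    public static void main(String[] args) {
        double[] t = {1.0, 2.0, 0.5};      // camera position in the world frame
        double[] q = {0.0, 0.0, 0.0, 1.0}; // identity orientation
        System.out.println(Arrays.toString(packPose(t, q)));
    }
}
```

A geometry_msgs/PoseStamped would be the more idiomatic message type here, but a flat Float64MultiArray avoids building a header and keeps the publisher trivial.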

eric-schleicher commented 6 years ago

The 5 Hz part: was that an arbitrary choice, or some external limitation?

(edit) perhaps because you're sending the image as well?

daranguiz commented 6 years ago

Yep, I'm sending the image too. You could probably go full rate (30 Hz) over USB tethering (enable USB tethering, use the appropriate IP address on your desktop) instead of over wifi, but I didn't try going any faster.

eric-schleicher commented 6 years ago

In a different example, using JavaScript/WebVR from a browser, I was able to push poses over WebSockets using roslibjs to rosbridge at about 100 Hz (no photos); IIRC it was about 32 kB/s.

Unfortunately that wasn't ARCore, but networking-wise that should be plenty of speed to move something simple like poses.

daranguiz commented 6 years ago

Oh, yeah. If you're just streaming poses, you can definitely go at the ARCore max rate (30 Hz, same as the GL update rate). I'm streaming a 40 Hz topic alongside the image topic, and my subscriber is receiving the updates in real time.

eric-schleicher commented 6 years ago

@daranguiz if you can't share your project, can you point me to how to get the RGB image from the camera? I'm building effectively the same thing (posting the pose and RGB to ROS). I see how to get the pose, but not the actual camera texture (in the HelloAR example).

The divergence here from the streamer topic is only to keep it known that ARCore/ROS integration is a need.

daranguiz commented 6 years ago

@eric-schleicher have you looked at the new computer vision sample app released alongside the ARCore v2 developer preview? Link is here. They implement a way to grab the camera texture and save it to CPU-accessible memory in the /utility folder; for usage, see line 259 of MainActivity. CameraImageBuffer is just a wrapper class with an internal ByteBuffer. You should be able to convert that to a bitmap and send it via a sensor_msgs/Image topic.

Sorry to be so cagey - let me know if this works for you!
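The ByteBuffer-to-Image step described above might look roughly like this. `RosImage` is a plain stand-in for rosjava's sensor_msgs.Image (which also carries a header and is normally built through a message factory), and the rgba8 encoding is an assumption about the sample's pixel format.

```java
import java.nio.ByteBuffer;

// Hypothetical sketch: copy the CPU-side pixel buffer (what the ARCore
// sample's CameraImageBuffer wraps) into the fields of a sensor_msgs/Image.
// Field names mirror the message definition; rgba8 is an assumed encoding.
public class ImageBridge {
    // Minimal stand-in for sensor_msgs/Image.
    static class RosImage {
        int height, width, step;
        String encoding;
        byte[] data;
    }

    static RosImage toRosImage(ByteBuffer pixels, int width, int height) {
        RosImage img = new RosImage();
        img.width = width;
        img.height = height;
        img.encoding = "rgba8";   // 4 bytes per pixel
        img.step = width * 4;     // full row length in bytes
        img.data = new byte[img.step * height];
        pixels.rewind();
        pixels.get(img.data);     // bulk-copy the frame
        return img;
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(2 * 2 * 4); // fake 2x2 RGBA frame
        RosImage img = toRosImage(buf, 2, 2);
        System.out.println(img.step + " " + img.data.length);
    }
}
```

Skipping the bitmap round-trip and filling the Image bytes directly avoids an extra copy per frame, which matters if you're chasing the 30 Hz rate discussed above.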

eric-schleicher commented 6 years ago

No worries! That helps a bunch!