What support do we have for computer vision and obstacle avoidance in the PX4/Dronecode platform? I want enough info so that someone who wanted to add obstacle avoidance (for example) to their drone could understand:
Docs on VIO:
Docs on obstacle avoidance:
My understanding is that there are two main applications for computer vision:
What I think happens is that VIO requires an external system that supplies position and pose information to PX4. PX4 can then be set up to use this information by configuring the estimator to fuse data from the external source.
There seem to be two relevant messages: ATT_POS_MOCAP and VISION_POSITION_ESTIMATE. What is the "difference"/when is one used and not the other? Which ones does PX4 use? My further understanding is that the external source of the messages can be "anything" - i.e. a black box. However, the supported/documented mechanism is:
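For concreteness, here is a minimal sketch of what an external "black box" streaming one of these messages could look like, using pymavlink. The connection string, update rate, and get_pose() helper are assumptions/hypothetical, not the documented mechanism:

```python
# Sketch: an external VIO/mocap system streaming pose to PX4 as
# VISION_POSITION_ESTIMATE. get_pose() is a hypothetical stand-in for the
# vision system's output; ATT_POS_MOCAP would carry a quaternion instead.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed endpoint
master.wait_heartbeat()

while True:
    x, y, z, roll, pitch, yaw = get_pose()  # hypothetical pose source (NED, rad)
    master.mav.vision_position_estimate_send(
        int(time.time() * 1e6),   # usec: timestamp
        x, y, z,                  # position in metres
        roll, pitch, yaw)         # attitude in radians
    time.sleep(0.033)             # ~30 Hz stream
```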
My guess is that it works much the same way as VIO - there is some stream of messages that you can send to the vehicle to tell it that it needs to move in a particular way irrespective of current navigation mode.
@LorenzMeier @baumanta , @vilhjalmur89, @mrivi , @JonasVautherin I was wondering if you could help me understand our computer vision story so I can improve the docs/entry points on the user and devguide.
There are some documents already, but they all assume that you understand the architecture already. I want to assume a user who knows nothing and wants to be able to understand the integration points and what they need - how it works, hardware, software ...
All my questions here: https://github.com/PX4/Devguide/issues/530#issuecomment-401668976
If you can't answer, can you please point me to others who might be able to help?
@hamishwillee Would it make sense to draw a big picture with all the critical components of computer vision -
Individual pages can be dedicated to each algorithm. Even if it gets repetitive, it's better to have separate pages for each algo (unlike https://dev.px4.io/en/ros/external_position_estimation.html).
Lastly, a page on "Deep Learning for Computer Vision" could be added. If we do not have working examples, it can serve as a placeholder for the future.
@lbegani Thanks very much for responding.
A diagram would probably help, but I won't be able to comment more on structure until someone answers my questions above.
My gut feeling though is that right now I don't want to explain every possible component of the system and have a breakdown of the possible paths. I want to explain what we have now, and how you can get up and running. Can you take a shot at answering any of my questions?
My shot. I might be wrong; you would still need comments from experts -
- What other docs on VIO/Obstacle avoidance do we have?
https://dev.px4.io/en/tutorials/optical_flow.html https://docs.px4.io/en/flight_controller/intel_aero.html
- Other than VIO/obstacle avoidance, what is computer vision useful for?
Optical flow? It's not a part of VIO.
- Is that correct? If not, what am I missing?
There is an ODOMETRY message declared in MAVLink, but it is not yet handled in PX4.
- There seems to be overlap between ATT_POS_MOCAP and VISION_POSITION_ESTIMATE. What is the "difference"/when is one used and not the other? Which ones does PX4 use?
I think the system will be set up to output only one of them.
- OPTICAL_FLOW provides altitude and position info - is this also fused with the other information?
OPTICAL_FLOW provides displacement info; a distance sensor provides the altitude info. (A sketch of feeding such data over MAVLink follows after these answers.)
- I assume that the information from the messages will be fused irrespective of mode (i.e. you don't have to be running in offboard mode). Is that correct?
Correct.
- If not running in offboard mode, are there any constraints/requirements for the external system supplying the information (i.e. data rates etc.)?
Not sure if there are any constraints other than correct values and low latency.
- Is that last point correct? As in 4, it isn't clear which message you would send if you wanted to write your own MAVLink service for this.
I think the algorithm running on the companion board will take input from sensors and output its algorithm-specific MAVLink message. Can we have multiple algorithms running simultaneously giving position as output? I don't think so.
- My understanding is that there is no PX4-only VIO integration - i.e. you can't connect a stereo camera to a PX4 port and from then on have a reliable position/pose estimate. Is that correct?
Correct. PX4 cannot take visual data as input and give pose estimation output.
- If this is correct, is there any documentation about the protocol?
Not sure.
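Picking up the optical flow / distance sensor split mentioned above: a sketch of how a companion could supply the altitude half over MAVLink as a DISTANCE_SENSOR message. The connection string and the 150 cm reading are assumptions:

```python
# Sketch: report a downward-facing rangefinder reading as DISTANCE_SENSOR
# (the altitude half of the optical flow + distance sensor pairing).
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed endpoint
master.wait_heartbeat()
master.mav.distance_sensor_send(
    0,                                              # time_boot_ms since sender boot (simplified)
    20, 700,                                        # min/max range of the sensor, cm
    150,                                            # current reading, cm (assumed)
    mavutil.mavlink.MAV_DISTANCE_SENSOR_LASER,      # sensor type
    0,                                              # sensor id
    mavutil.mavlink.MAV_SENSOR_ROTATION_PITCH_270,  # facing straight down
    0)                                              # covariance: 0 = unknown
```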
I think the overall focus should be on what we have robustly working today and document that well so people can reliably reproduce our results.
@hamishwillee For obstacle avoidance, right now the only supported communication interface is the offboard one. So the drone needs to be in Offboard mode, and the setpoints are sent from the obstacle avoidance module via the SET_POSITION_TARGET_LOCAL_NED MAVLink message. The translation between the ROS messages and the MAVLink message is done by the MAVROS local position plugin.
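For reference, a sketch of the equivalent MAVLink traffic that MAVROS generates for this offboard interface; the connection string, type_mask, and setpoint values are assumptions:

```python
# Sketch: stream position setpoints as SET_POSITION_TARGET_LOCAL_NED, the
# message MAVROS emits for the offboard interface described above.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed endpoint
master.wait_heartbeat()

while True:
    master.mav.set_position_target_local_ned_send(
        int(time.time() * 1000) & 0xFFFFFFFF,   # time_boot_ms
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0x0FF8,              # type_mask: use position fields only
        5.0, 0.0, -2.0,      # x, y, z (NED, so z is negative above ground)
        0, 0, 0,             # vx, vy, vz (ignored by the mask)
        0, 0, 0,             # afx, afy, afz (ignored)
        0, 0)                # yaw, yaw_rate (ignored)
    time.sleep(0.1)          # offboard needs a continuous setpoint stream
```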
As soon as this Firmware PR (https://github.com/PX4/Firmware/pull/9270) gets merged there will be a new interface alongside offboard.
The FCU can send goals to the obstacle avoidance through the TRAJECTORY MAVLink message and the trajectory MAVROS plugin, and the obstacle avoidance sends back the collision-free waypoints through the same MAVLink message and MAVROS plugin.
The TRAJECTORY MAVLink message can describe both waypoints and trajectories. Currently the firmware supports only waypoints. The message can contain up to 5 waypoints, but currently they aren't all used. Each waypoint is described by position, velocity, acceleration, yaw and yaw_speed (not all the fields need to be filled).
Message from FCU to obstacle avoidance (Firmware uORB topic vehicle_trajectory_waypoint_desired)
Message from avoidance to FCU (Firmware uORB topic vehicle_trajectory_waypoint)
This interface can theoretically be used in any mode. However, so far the above-mentioned PR restricts the usage to Mission and RTL. To enable the interface, the parameter MPC_OBS_AVOID needs to be set to true in QGC.
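For completeness, the same parameter could presumably be set over MAVLink instead of through QGC; a sketch with pymavlink (connection string and parameter type are assumptions):

```python
# Sketch: enable the obstacle avoidance interface by setting MPC_OBS_AVOID,
# as an alternative to changing it in QGC.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed endpoint
master.wait_heartbeat()
master.mav.param_set_send(
    master.target_system, master.target_component,
    b'MPC_OBS_AVOID',                      # parameter name
    1,                                     # 1 = enabled
    mavutil.mavlink.MAV_PARAM_TYPE_INT32)  # type assumed INT32 (boolean param)
```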
I guess my description is quite messy, let me know where I need to clarify.
@mrivi Thanks very much - that helps a hell of a lot - especially with the linked design docs. I'm sure I'll have a lot of questions. Here are just a few:
For obstacle avoidance, right now the only supported communication interface is the offboard one. So the drone needs to be in Offboard mode and from the obstacle avoidance module the setpoints are sent via the SET_POSITION_TARGET_LOCAL_NED mavlink message. The translation between the ROS messages and mavlink message is done by the MAVROS local position plugin.
Does this apply to both the new solution and the old solution?
The obstacle avoidance module obviously needs to have a picture of obstacles.
At the moment the interface appears to be over MAVLink using the TRAJECTORY messages, with ROS then converting these into something else. You have told me the internal uORB messages that PX4 uses - I assume that the plan in future is that we might use RTPS/ROS2 to directly share these with ROS?
Sorry, my questions in response are a bit random too. Essentially I'm trying to dig into the detail and work out how someone would set this up themselves from end to end, using the solution right now, and as delivered by PX4/Firmware#9270.
PS Thanks @lbegani I think I'll come back to the VIO bit later.
a) Set goal_x_param, goal_y_param, goal_z_param in the launch script of the local/global planner, or set goal_z_param in the launch script and then set the x, y in Rviz by clicking where you want to go in the environment representation (for the local planner this step is described in the README).
b) Yes, the avoidance gets the drone position through the mavros topic /mavros/local_position/pose and sends the waypoints through /mavros/setpoint_position/local. The mavros node maps the geometry_msgs::PoseStamped ROS message that has been sent on the topic /mavros/setpoint_position/local to the MAVLink message SET_POSITION_TARGET_LOCAL_NED.
The input to both obstacle avoidance algorithms is a point cloud. Currently we are testing with the Intel RealSense. Intel provides a ROS node to access their librealsense API, so the planner only needs to listen to the provided topic.
Yes, the obstacle avoidance is a ROS node.
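A sketch of that ROS plumbing from the avoidance node's point of view: subscribe to the vehicle pose and publish position setpoints that MAVROS maps to SET_POSITION_TARGET_LOCAL_NED. The topic names are from this thread; the node name, rate, and goal values are assumptions:

```python
# Sketch of the ROS 1 topic plumbing described above: listen to the vehicle
# pose, publish position setpoints. MAVROS converts ENU to NED.
import rospy
from geometry_msgs.msg import PoseStamped

current_pose = PoseStamped()

def pose_cb(msg):
    global current_pose
    current_pose = msg  # latest vehicle pose, available to a planner

rospy.init_node('avoidance_io_sketch')  # assumed node name
rospy.Subscriber('/mavros/local_position/pose', PoseStamped, pose_cb)
setpoint_pub = rospy.Publisher('/mavros/setpoint_position/local',
                               PoseStamped, queue_size=10)

rate = rospy.Rate(10)  # assumed rate
while not rospy.is_shutdown():
    sp = PoseStamped()
    sp.header.stamp = rospy.Time.now()
    sp.header.frame_id = 'map'
    sp.pose.position.x = 5.0   # assumed goal, ENU frame
    sp.pose.position.y = 0.0
    sp.pose.position.z = 2.0
    setpoint_pub.publish(sp)
    rate.sleep()
```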
Flow of information with the new interface:
PX4 Firmware: drone current state, desired goal in uORB vehicle_trajectory_waypoint_desired
-> mavlink module: the uORB message is mapped to the MAVLink TRAJECTORY message
-> MAVROS: trajectory plugin converts it to the ROS message mavros_msgs::Trajectory
-> avoidance ROS node subscribes to /mavros/trajectory/desired
-> avoidance plans a collision-free path
-> avoidance publishes messages of type mavros_msgs::Trajectory on /mavros/trajectory/generated, or messages of type nav_msgs::Path on /mavros/trajectory/path
-> MAVROS: trajectory plugin transforms the mavros_msgs::Trajectory or nav_msgs::Path into the MAVLink TRAJECTORY message
-> PX4 Firmware: mavlink receiver maps the TRAJECTORY message to uORB vehicle_trajectory_waypoint
-> position control tracks the waypoints
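Sketched as a minimal ROS node, the avoidance end of this flow could look like the following; the actual planning step is omitted and the node name is an assumption:

```python
# Sketch of the avoidance node's interface in the flow above: consume the
# desired trajectory from the FCU and publish (here, simply echo) a
# collision-free one.
import rospy
from mavros_msgs.msg import Trajectory

pub = rospy.Publisher('/mavros/trajectory/generated', Trajectory, queue_size=10)

def desired_cb(msg):
    # A real planner would check the point cloud here and move the first
    # waypoint off any obstacle before republishing.
    out = msg
    out.header.stamp = rospy.Time.now()
    pub.publish(out)

rospy.init_node('avoidance_sketch')  # assumed node name
rospy.Subscriber('/mavros/trajectory/desired', Trajectory, desired_cb)
rospy.spin()
```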
There is also an OBSTACLE_DISTANCE MAVLink message that enables sending information on the distance of obstacles 360° around a drone, with a maximum resolution of 5 degrees on the azimuth angle. Elevation is all squished into a single bucket. This message is not yet used in the Firmware. The plan is to use it for a basic sense&stop feature in the firmware.
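A sketch of populating that message with pymavlink: 72 bins of 5 degrees each cover the 360°, and distances are in centimetres. The connection string and the single reading are assumptions:

```python
# Sketch: publish OBSTACLE_DISTANCE with 72 bins of 5 degrees each
# (72 x 5 = 360), matching the resolution described above.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed endpoint
master.wait_heartbeat()

distances = [65535] * 72  # UINT16_MAX = unknown / not used per the spec
distances[0] = 800        # assumed obstacle 8 m straight ahead
master.mav.obstacle_distance_send(
    0,                                           # time_usec (simplified)
    mavutil.mavlink.MAV_DISTANCE_SENSOR_LASER,   # sensor type
    distances,                                   # uint16[72], cm
    5,                                           # increment between bins, deg
    20,                                          # min_distance, cm
    1000)                                        # max_distance, cm
```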
@hamishwillee Hope this clarifies some things. Feel free to keep asking questions :)
Thanks @mrivi - it does, and I will [evil snigger]. I'm mostly committed to MAVLink stuff and general external interfaces now, so might not get back to this until Monday.
Just a few now. So I think the above (on a scan) is enough to understand how things work, but not to set up a system to do this. Does the team have turnkey instructions for your current setup, or can you help create them?
Essentially this page was about explaining what we offer, with plans to link off to other docs for key information. It makes sense for the team doing the work to document their setup for that linked page. I can certainly help with review and structure once the information is created. Thoughts?
@hamishwillee yes, we're currently testing on Aero with realsense. ok, I'll discuss with Tanja how to start documenting the HW setup.
@baumanta has documented the Aero setup here https://docs.px4.io/en/flight_controller/intel_aero.html
@mrivi Thanks for that. I was aware of that doc, but did not remember that the setup covered this aspect. I'll try to get my head around all of this during the week and create an introductory doc you can review.
Hi @hamishwillee , I would like to help bring the obstacle avoidance interface into the documentation. How can I help?
Hi @mrivi ,
Apologies. This fell off my priority list. Let's start by clarifying how the architecture has changed/how it is now. I see some churn :-)
Previously I believe you said: the FCU sends a TRAJECTORY message to a companion, a ROS node processes the trajectory against a vision-generated map of the path, and sends the vehicle the actual avoidance path to take in another TRAJECTORY message. The firmware makes the move in some set of modes. But I have seen a bit of churn on GitHub, so I suspect that has changed.
So basically we need to know how things work now, and further
How we proceed depends on the answers to the above. But assuming things were as before, I would actually have started by documenting the MAVLink protocol for obstacle avoidance - i.e. "generically", similar to https://mavlink.io/en/services/mission.html
Hi @hamishwillee, I'll try to answer to the best of my knowledge:
@baumanta Thank you. I'd better wait for @mrivi, because the TRAJECTORY message no longer exists, which implies that lots of other things might have changed.
We should document the new collision avoidance behaviour too. I will discuss that on the PR.
@hamishwillee The MAVLink TRAJECTORY message had two different types, bezier or waypoint. We have restructured them into TRAJECTORY_REPRESENTATION_WAYPOINTS and TRAJECTORY_REPRESENTATION_BEZIER. The fields are the same as in the old TRAJECTORY message.
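A sketch of sending a single waypoint in the new TRAJECTORY_REPRESENTATION_WAYPOINTS message with pymavlink. The exact field list depends on your MAVLink dialect version (the command array is a later addition), so treat the signature, connection string, and values as assumptions to check against your dialect:

```python
# Sketch: send one valid waypoint in TRAJECTORY_REPRESENTATION_WAYPOINTS.
# Field list follows the current common dialect; older dialects may differ.
import time
from pymavlink import mavutil

NAN = float('nan')
master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed endpoint
master.wait_heartbeat()

def vec(first):
    # 5-element array: first waypoint set, the rest NaN (unused)
    return [first, NAN, NAN, NAN, NAN]

master.mav.trajectory_representation_waypoints_send(
    int(time.time() * 1e6),          # time_usec
    1,                               # valid_points
    vec(10.0), vec(0.0), vec(-3.0),  # pos_x, pos_y, pos_z (NED, m)
    vec(NAN), vec(NAN), vec(NAN),    # vel_x, vel_y, vel_z (unused)
    vec(NAN), vec(NAN), vec(NAN),    # acc_x, acc_y, acc_z (unused)
    vec(NAN),                        # pos_yaw
    vec(NAN),                        # vel_yaw
    [0xFFFF] * 5)                    # command (later-dialect field, UINT16_MAX = none)
```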
I am preparing a description of what is implemented. I'll post it here as soon as it's ready.
@hamishwillee
Mission Mode - Obstacle Avoidance Interface
When a mission is uploaded from QGC and the parameter MPC_OBS_AVOID is set to True, the Firmware fills the uORB message vehicle_trajectory_waypoint_desired in the following way.
Array waypoints:
index 0:
index 1:
index 2:
The remaining indices are filled with NaN.
The message vehicle_trajectory_waypoint_desired is mapped into the MAVLink message TRAJECTORY_REPRESENTATION_WAYPOINTS. The messages are sent at 5 Hz.
MAVROS translates the MAVLink message into a ROS message called mavros_msgs::Trajectory and does the conversion from NED to ENU frames. Messages are published on the ROS topic /mavros/trajectory/desired.
On the avoidance side, the algorithm plans a path to the waypoint.
The position or velocity setpoints generated by the obstacle avoidance to reach the waypoint collision-free can be sent to the Firmware with two ROS messages: mavros_msgs::Trajectory (both velocity and position setpoints) on ROS topic /mavros/trajectory/generated, or nav_msgs::Path (only position setpoints) on ROS topic /mavros/trajectory/path.
MAVROS converts the setpoints from ENU to NED frame and translates the ROS messages into the MAVLink message TRAJECTORY_REPRESENTATION_WAYPOINTS.
On the Firmware side, incoming TRAJECTORY_REPRESENTATION_WAYPOINTS messages are translated into uORB vehicle_trajectory_waypoint messages. The array waypoints contains all NaN except for index 0:
The setpoints are tracked by the multicopter position controller.
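For illustration, a sketch of the nav_msgs::Path option (position setpoints only); the waypoint values and node name are assumptions, and MAVROS does the ENU to NED conversion:

```python
# Sketch: publish position-only setpoints as nav_msgs/Path on
# /mavros/trajectory/path, the second option described above.
import rospy
from geometry_msgs.msg import PoseStamped
from nav_msgs.msg import Path

rospy.init_node('avoidance_path_sketch')  # assumed node name
pub = rospy.Publisher('/mavros/trajectory/path', Path, queue_size=10)

path = Path()
path.header.stamp = rospy.Time.now()
path.header.frame_id = 'map'
wp = PoseStamped()
wp.header = path.header
wp.pose.position.x = 4.0   # assumed waypoint, ENU, m
wp.pose.position.y = 1.0
wp.pose.position.z = 2.5
path.poses.append(wp)

rate = rospy.Rate(5)       # the thread mentions 5 Hz for the desired stream
while not rospy.is_shutdown():
    pub.publish(path)
    rate.sleep()
```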
Mission Progression: The mission logic is handled by the navigator in the same way as for flight without obstacle avoidance, apart from two differences:
Easy-to-discover landing page for all things computer vision. The expectation is that you can go to the dev guide and have everything laid out about all the components that can be leveraged.
This should also be linked from the user guide as a concept.
Link or move docs into the Developer Guide.
Other resources: