
Computer Vision Landing Page #530

Closed hamishwillee closed 5 years ago

hamishwillee commented 6 years ago

Easy to discover landing page for all things computer vision. Expectation is that you can go to the dev guide and have everything laid out about all components that can be leveraged.

This should also be linked from the user guide as a concept.

Link to or move the relevant docs into the Developer Guide.

Other resources:

hamishwillee commented 6 years ago

What support do we have for computer vision and obstacle avoidance in the PX4/Dronecode platform? I am after enough info that someone who wanted to add obstacle avoidance (for example) to their drone could understand:

What docs do we have on these topics?

Docs on VIO

Docs on obstacle avoidance:

  1. What other docs on VIO/Obstacle avoidance do we have?

What is Computer Vision for?

My understanding is that there are two main applications for computer vision:

  1. Other than VIO/obstacle avoidance, what is computer vision useful for?

How does VIO work on PX4? (Summary)

What I think happens is that VIO requires an external system that supplies position and pose information to PX4. PX4 can then be set up to use this data by telling the estimator to fuse the measurements from the external source.
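For concreteness, here is a minimal sketch of what "supplying position and pose" could look like over MAVLink (using VISION_POSITION_ESTIMATE as the example message). It uses pymavlink; the connection string, rate, and pose values are illustrative assumptions, not PX4 requirements:

```python
# Minimal sketch: stream an external vision pose estimate to PX4 over
# MAVLink using pymavlink. Connection string, rate, and pose values are
# placeholders for illustration only.
import time
from pymavlink import mavutil

# Assumed link: PX4 reachable on UDP 14540 (e.g. via a companion computer).
master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')
master.wait_heartbeat()

while True:
    # x, y, z in metres (local NED frame); roll, pitch, yaw in radians.
    # A real system would fill these from its vision pipeline.
    master.mav.vision_position_estimate_send(
        int(time.time() * 1e6),  # usec: timestamp in microseconds
        1.0, 2.0, -1.5,          # x, y, z
        0.0, 0.0, 0.0)           # roll, pitch, yaw
    time.sleep(0.033)            # ~30 Hz, a typical vision rate
```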

  1. Is that correct? If not, what am I missing?
  2. There seems to be overlap between ATT_POS_MOCAP and VISION_POSITION_ESTIMATE. What is the "difference"/when is one used and not the other? Which ones does PX4 use?
  3. OPTICAL_FLOW provides altitude and position info - is this also fused with the other information?
  4. I assume that the information from the messages will be fused irrespective of mode (ie you don't have to be running in offboard mode). Is that correct?
  5. If not running in offboard mode are there any constraints/requirements for the external system for supplying the information (ie data rates etc?)

My further understanding is that the external source of the messages can be "anything" - i.e. a black box. However, the supported/documented mechanism is:

  1. Is that last point correct? As in 4 it isn't clear which message you would send if you wanted to write your own mavlink service for this.
  2. My understanding is that there is no PX4-only VIO integration - ie you can't connect a stereo camera to PX4 port and from then on have a reliable position/pose estimate. Is that correct?

How does Obstacle avoidance work on PX4?

My guess is that it works much the same way as VIO - there is some stream of messages that you can send to the vehicle to tell it that it needs to move in a particular way irrespective of current navigation mode.

  1. If this is correct, is there any documentation about the protocol?

Hardware and Software required.

hamishwillee commented 6 years ago

@LorenzMeier, @baumanta, @vilhjalmur89, @mrivi, @JonasVautherin I was wondering if you could help me understand our computer vision story so I can improve the docs/entry points in the user and dev guides.

There are some documents already, but they all assume that you understand the architecture already. I want to assume a user who knows nothing and wants to be able to understand the integration points and what they need - how it works, hardware, software ...

All my questions here: https://github.com/PX4/Devguide/issues/530#issuecomment-401668976

If you can't answer, can you please point me to others who might be able to help?

lbegani commented 6 years ago

@hamishwillee Would it make sense to draw a big picture with all the critical components of computer vision -

  1. Sensors (Monocular, Stereo, IMU, Mag, LIDAR etc)
  2. Sensor Drivers (ROS Nodes etc)
  3. Visual Algorithms (Optical Flow, VIO, Obstacle Avoidance etc)
  4. Messaging Channels (MAVROS, mavlink-router, UART etc)
  5. MAVLink Messages (V_P_E, O_F_R, O_D etc)
  6. PX4 (EKF, LPE etc)

Individual pages can be dedicated to each algorithm. Even if it gets repetitive, it's better to put separate pages for each algo (unlike https://dev.px4.io/en/ros/external_position_estimation.html):

  1. Explain the Algorithm
  2. Target platform
  3. Setup steps

Lastly, a page on "Deep Learning for Computer Vision" could be added. If we do not have working examples, it can serve as a placeholder for the future.

hamishwillee commented 6 years ago

@lbegani Thanks very much for responding.

A diagram would probably help, but I won't be able to comment more on structure until someone answers my questions above.

My gut feeling though is that right now I don't want to explain every possible component of the system and have a breakdown of the possible paths. I want to explain what we have now, and how you can get up and running. Can you take a shot at answering any of my questions?

lbegani commented 6 years ago

My shot - I might be wrong, so you would still need comments from the experts:

  1. What other docs on VIO/Obstacle avoidance do we have?

https://dev.px4.io/en/tutorials/optical_flow.html https://docs.px4.io/en/flight_controller/intel_aero.html

  1. Other than VIO/obstacle avoidance, what is computer vision useful for?

Optical flow? It's not a part of VIO.

  1. Is that correct? If not, what am I missing?

There is an ODOMETRY message declared in MAVLink, but it is not yet handled in PX4.

  1. There seems to be overlap between ATT_POS_MOCAP and VISION_POSITION_ESTIMATE. What is the "difference"/when is one used and not the other? Which ones does PX4 use?

I think the system will be set up to output only one of them.

  1. OPTICAL_FLOW provides altitude and position info - is this also fused with the other information?

OPTICAL_FLOW provides displacement info. Distance sensor provides altitude info.

  1. I assume that the information from the messages will be fused irrespective of mode (ie you don't have to be running in offboard mode). Is that correct?

Correct.

  1. If not running in offboard mode are there any constraints/requirements for the external system for supplying the information (ie data rates etc?)

Not sure if there are any constraints other than correct values and low latency.

  1. Is that last point correct? As in 4 it isn't clear which message you would send if you wanted to write your own mavlink service for this.

I think the algo running on the companion board will take input from sensors and output the algo-specific MAVLink message. Can we have multiple algorithms running simultaneously, each giving position as output? I don't think so.

  1. My understanding is that there is no PX4-only VIO integration - ie you can't connect a stereo camera to PX4 port and from then on have a reliable position/pose estimate. Is that correct?

Correct. PX4 cannot take visual data as input and give pose estimation output.

  1. If this is correct, is there any documentation about the protocol?

Not sure.

LorenzMeier commented 6 years ago

I think the overall focus should be on what we have robustly working today and document that well so people can reliably reproduce our results.

mrivi commented 6 years ago

@hamishwillee For obstacle avoidance, right now the only supported communication interface is the offboard one. So the drone needs to be in Offboard mode, and the setpoints are sent from the obstacle avoidance module via the SET_POSITION_TARGET_LOCAL_NED mavlink message. The translation between the ROS messages and the mavlink message is done by the MAVROS local position plugin.

As soon as this Firmware PR (https://github.com/PX4/Firmware/pull/9270) gets merged, there will be a new interface alongside offboard. The FCU can send goals to the obstacle avoidance through the TRAJECTORY mavlink message and the MAVROS trajectory plugin, and the obstacle avoidance sends back the collision-free waypoints through the same mavlink message and mavros plugin.

The TRAJECTORY mavlink message can describe both waypoints and trajectories. Currently the firmware supports only waypoints. The message can contain up to 5 waypoints, but currently they aren't all used. Each waypoint is described by position, velocity, acceleration, yaw and yaw_speed (not all the fields need to be filled).
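To make the offboard interface in the first paragraph concrete, here is a minimal pymavlink sketch of streaming SET_POSITION_TARGET_LOCAL_NED directly (the message the MAVROS local position plugin emits for you). The connection string, type_mask value, and setpoint are assumptions for illustration:

```python
# Sketch: stream position setpoints via SET_POSITION_TARGET_LOCAL_NED,
# i.e. what the MAVROS local position plugin sends in offboard mode.
# Connection string and setpoint values are illustrative assumptions.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')
master.wait_heartbeat()

# type_mask 0b110111111000: use the position fields only; ignore
# velocity, acceleration, yaw and yaw_rate fields.
POSITION_ONLY = 0b110111111000

while True:
    master.mav.set_position_target_local_ned_send(
        0,                                   # time_boot_ms (unused here)
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        POSITION_ONLY,
        5.0, 0.0, -2.0,                      # x, y, z (NED: z negative = up)
        0, 0, 0, 0, 0, 0, 0, 0)              # vx..afz, yaw, yaw_rate (ignored)
    time.sleep(0.05)  # PX4 expects a steady setpoint stream (> 2 Hz)
```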

Message from FCU to obstacle avoidance (Firmware uORB topic vehicle_trajectory_waypoint_desired)

Message from avoidance to FCU (Firmware uORB topic vehicle_trajectory_waypoint)

This interface can theoretically be used in any mode. However, so far the above-mentioned PR restricts its usage to mission and RTL. To enable the interface, the parameter MPC_OBS_AVOID needs to be set to true in QGC.
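As an aside, the parameter can also be set programmatically. A hedged pymavlink sketch, assuming the usual UDP link and an INT32 parameter type (QGC's parameter editor does the equivalent):

```python
# Sketch: enable the avoidance interface by setting MPC_OBS_AVOID.
# Assumes a UDP MAVLink link and an INT32 parameter; QGC's parameter
# editor is the normal way to do this.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')
master.wait_heartbeat()
master.mav.param_set_send(
    master.target_system, master.target_component,
    b'MPC_OBS_AVOID',                       # parameter name
    1,                                      # 1 = enabled
    mavutil.mavlink.MAV_PARAM_TYPE_INT32)   # assumed parameter type
```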

I guess my description is quite messy, let me know where I need to clarify.

hamishwillee commented 6 years ago

@mrivi Thanks very much - that helps a hell of a lot - especially with the linked design docs. I'm sure I'll have a lot of questions. Here are just a few:

For obstacle avoidance, right now the only supported communication interface is the offboard one. So the drone needs to be in Offboard mode, and the setpoints are sent from the obstacle avoidance module via the SET_POSITION_TARGET_LOCAL_NED mavlink message. The translation between the ROS messages and the mavlink message is done by the MAVROS local position plugin.

  1. When is the new solution likely to deliver?
  2. Is the current offboard mode solution really "alongside" or will there be a move to make it work with the same interface as the other modes? If so we have to document this.
  3. There really isn't enough to completely understand how the offboard solution works.
    • What is defining the original path in this case, and how does that information get to the obstacle avoidance system? Or to put it another way, say I write a Dronecode SDK app to drive my vehicle in offboard mode - how does it integrate with the obstacle avoidance system?
    • I envisage that ROS gets vehicle pose and movement, and sends this along with planned path to obstacle avoidance module. Obstacle avoidance module works out anchor points to avoid obstacles, and sends to the trajectory library, trajectory library sends out SET_POSITION_TARGET_LOCAL_NED for the new path.
  4. Where it says "The translation between the ROS messages and mavlink message is done by the MAVROS local position plugin."
    • At what point do you need to do this translation?

For the new solution and old solution.

The obstacle avoidance module obviously needs to have a picture of obstacles.

At the moment the interface appears to be over MAVLink using the TRAJECTORY messages, with ROS then converting these into something else. You have told me the internal uORB messages that PX4 uses - I assume that the plan in future is that we might use RTPS/ROS2 to directly share these with ROS?

Sorry, my questions in response are a bit random too. Essentially I'm trying to dig into the detail and work out how someone would set this up themselves from end to end, using the solution right now, and as delivered by PX4/Firmware#9270.

hamishwillee commented 6 years ago

PS Thanks @lbegani I think I'll come back to the VIO bit later.

mrivi commented 6 years ago
  1. Unfortunately, no clue. I wouldn't expect it any time soon at the speed things are evolving.
  2. For now there is no plan to change offboard
  3. a) In offboard there are two ways of setting the goal: either set the parameters goal_x_param, goal_y_param, goal_z_param in the launch script of the local/global planner, or set goal_z_param in the launch script and then set the x, y in Rviz by clicking where you want to go in the environment representation (for the local planner this step is described in the README). b) Yes, the avoidance gets the drone position through the mavros topic /mavros/local_position/pose and sends the waypoints through /mavros/setpoint_position/local. The mavros node maps the geometry_msgs::PoseStamped ROS message sent on the topic /mavros/setpoint_position/local to the mavlink message SET_POSITION_TARGET_LOCAL_NED.
  4. The user doesn't have to do the translation. They have to launch the MAVROS node and use the defined topics to send messages to and from it. I don't think the mavros plugins are documented anywhere; a user has to go through the code to understand how the messages are mapped (see the sketch after this list).
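To illustrate 3b/4, a minimal rospy sketch of the ROS side of that mapping - the topic name is from this thread; the rate and target position are assumptions:

```python
#!/usr/bin/env python
# Minimal sketch (rospy): publish position setpoints the way the
# avoidance node does in offboard mode. MAVROS maps
# geometry_msgs/PoseStamped on /mavros/setpoint_position/local to the
# SET_POSITION_TARGET_LOCAL_NED mavlink message.
# The target position here is an arbitrary example value.
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node('setpoint_sketch')
pub = rospy.Publisher('/mavros/setpoint_position/local',
                      PoseStamped, queue_size=10)

rate = rospy.Rate(20)  # offboard needs a steady setpoint stream
while not rospy.is_shutdown():
    sp = PoseStamped()
    sp.header.stamp = rospy.Time.now()
    sp.pose.position.x = 5.0   # ENU frame on the ROS side;
    sp.pose.position.y = 0.0   # MAVROS converts to NED for MAVLink
    sp.pose.position.z = 2.0
    sp.pose.orientation.w = 1.0
    pub.publish(sp)
    rate.sleep()
```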

The input to both obstacle avoidance algorithms is a point cloud. Currently we are testing with Intel Realsense. Intel provides a ROS node to access their librealsense API so the planner needs only to listen to the provided topic.

Yes, the obstacle avoidance is a ROS node.

Flow of information with the new interface:

1. PX4 Firmware: the drone's current state and desired goal are published in the uORB message vehicle_trajectory_waypoint_desired.
2. In mavlink_messages, the uORB message is mapped to the MAVLINK TRAJECTORY message.
3. MAVROS: the trajectory plugin converts it to the ROS message mavros_msgs::Trajectory.
4. The avoidance ROS node subscribes to /mavros/trajectory/desired and plans a collision-free path.
5. The avoidance node publishes messages of type mavros_msgs::Trajectory on /mavros/trajectory/generated, or messages of type nav_msgs::Path on /mavros/trajectory/path.
6. MAVROS: the trajectory plugin transforms the mavros_msgs::Trajectory or nav_msgs::Path into the mavlink TRAJECTORY message.
7. PX4 Firmware: the mavlink receiver maps the TRAJECTORY message to vehicle_trajectory_waypoint, and the position controller tracks the waypoints.
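A hedged sketch of where a custom avoidance node would sit in this flow, assuming the mavros_msgs/Trajectory fields look as described in this thread (point_1 as a PositionTarget with a position field); the field names should be checked against the installed mavros_msgs definition:

```python
#!/usr/bin/env python
# Sketch of an avoidance node in the flow above: listen to the desired
# waypoints from the FCU and answer with a (here: unmodified) path.
# Field names (point_1, .position) are assumptions based on this thread;
# verify against the installed mavros_msgs message definitions.
import rospy
from mavros_msgs.msg import Trajectory
from nav_msgs.msg import Path
from geometry_msgs.msg import PoseStamped

def on_desired(msg):
    # A real planner would adjust the waypoint around obstacles here;
    # this sketch just echoes the first desired waypoint back.
    wp = PoseStamped()
    wp.header = msg.header
    wp.pose.position = msg.point_1.position
    path = Path()
    path.header = msg.header
    path.poses = [wp]
    pub.publish(path)

rospy.init_node('avoidance_sketch')
pub = rospy.Publisher('/mavros/trajectory/path', Path, queue_size=10)
rospy.Subscriber('/mavros/trajectory/desired', Trajectory, on_desired)
rospy.spin()
```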

There is also an OBSTACLE_DISTANCE mavlink message that can send information about obstacle distances 360° around the drone, with a maximum resolution of 5 degrees in azimuth. Elevation is all squished into a single bucket. The Firmware does not use this message yet; the plan is to use it for a basic sense & stop feature in the firmware.
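A hedged pymavlink sketch of filling that message, assuming the 72-bin layout that the 5-degree azimuth resolution implies (the connection string and distances are illustrative):

```python
# Sketch: publish a 360-degree obstacle map via OBSTACLE_DISTANCE.
# 72 bins x 5 deg = 360 deg, matching the 5-degree azimuth resolution
# mentioned above. Distances are in centimetres; 65535 marks "no data".
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed link
master.wait_heartbeat()

distances = [65535] * 72      # start with "unknown" in every bin
distances[0] = 250            # e.g. an obstacle 2.5 m straight ahead

master.mav.obstacle_distance_send(
    int(time.time() * 1e6),                     # time_usec
    mavutil.mavlink.MAV_DISTANCE_SENSOR_LASER,  # sensor type
    distances,                                  # uint16[72], cm
    5,                                          # increment per bin, deg
    20,                                         # min_distance, cm
    2500)                                       # max_distance, cm
```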

@hamishwillee Hope this clarifies some things. Feel free to keep asking questions :)

hamishwillee commented 6 years ago

@hamishwillee Hope this clarifies some things. Feel free to keep asking questions :)

Thanks @mrivi - it does, and I will [evil snigger]. I'm mostly committed to MAVLink stuff and general external interfaces now, so might not get back to this until Monday.

Just a few for now. I think (on a scan) the above is enough to understand how things work, but not to set up a system to do this. Does the team have turnkey instructions for your current setup, or can you help create them?

  1. Hardware - vehicle and required peripherals (I am guessing Intel Aero with RealSense camera?)
  2. Software - standard ROS installation on the companion computer. How do we get the nodes, how do we start them up?

Essentially this page was about explaining what we offer, with plans to link off to other docs for key information. It makes sense for the team doing the work to document their setup for that linked page. I can certainly help with review and structure once the information is created. Thoughts?

mrivi commented 6 years ago

@hamishwillee Yes, we're currently testing on Aero with RealSense. OK, I'll discuss with Tanja how to start documenting the HW setup.

mrivi commented 6 years ago

@baumanta has documented the Aero setup here https://docs.px4.io/en/flight_controller/intel_aero.html

hamishwillee commented 6 years ago

@mrivi Thanks for that. I was aware of that doc, but did not remember that the setup covered this aspect. I'll try to get my head around all of this during the week and create an introductory doc you can review.

mrivi commented 6 years ago

Hi @hamishwillee , I would like to help bring the obstacle avoidance interface into the documentation. How can I help?

hamishwillee commented 6 years ago

Hi @mrivi ,

Apologies. This fell off my priority list. Let's start by clarifying how the architecture has changed/how it is now. I see some churn :-)

Previously I believe you said:

But I have seen a bit of churn on github, so I suspect that has changed.

So basically we need to know how things work now, and further:

How we proceed depends on the answers to the above. But assuming things were as before, I would actually have started by documenting the mavlink protocol for obstacle avoidance - i.e. "generically", similar to https://mavlink.io/en/services/mission.html

baumanta commented 6 years ago

Hi @hamishwillee, I'll try to answer to the best of my knowledge:

hamishwillee commented 6 years ago

@baumanta Thank you. I'd better wait for @mrivi, because the TRAJECTORY message no longer exists, which implies that lots of other things might have changed.

We should document the new collision avoidance behaviour too. I will discuss that on the PR.

mrivi commented 6 years ago

@hamishwillee The mavlink TRAJECTORY message had two different types, bezier or waypoint. We have restructured them into TRAJECTORY_REPRESENTATION_WAYPOINTS and TRAJECTORY_REPRESENTATION_BEZIER. The fields are the same as in the old TRAJECTORY message. I am preparing a description of what is implemented; I'll post it here as soon as it's ready.

mrivi commented 6 years ago

@hamishwillee

Mission Mode - Obstacle Avoidance Interface

When a mission is uploaded from QGC and the parameter MPC_OBS_AVOID is set to True, the Firmware fills the uORB message vehicle_trajectory_waypoint_desired in the following way.

Array waypoints:

index 0:

index 1:

index 2:

The remaining indices are filled with NaN.

The message `vehicle_trajectory_waypoint_desired` is mapped into the Mavlink message `TRAJECTORY_REPRESENTATION_WAYPOINTS`. The messages are sent at 5Hz.

MAVROS translates the Mavlink message into a ROS message called mavros_msgs::Trajectory and does the conversion from NED to ENU frames. Messages are published on the ROS topic /mavros/trajectory/desired.

On the avoidance side, the algorithm plans a path to the waypoint.

The position or velocity setpoints generated by the obstacle avoidance to reach the waypoint collision-free can be sent to the Firmware with two ROS messages:

- mavros_msgs::Trajectory (both velocity and position setpoints) on ROS topic /mavros/trajectory/generated
- nav_msgs::Path (only position setpoints) on ROS topic /mavros/trajectory/path

MAVROS converts the setpoints from ENU to NED frame and translates the ROS messages into the MAVLINK message TRAJECTORY_REPRESENTATION_WAYPOINTS.

On the Firmware side, incoming TRAJECTORY_REPRESENTATION_WAYPOINTS messages are translated into uORB vehicle_trajectory_waypoint messages. The array waypoints contains all NaN except for index 0:

The setpoints are tracked by the multicopter position controller.
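For debugging, a small pymavlink sketch that watches the 5Hz TRAJECTORY_REPRESENTATION_WAYPOINTS stream from the ground side; the connection string is an assumption, field names follow the common MAVLink dialect:

```python
# Sketch: monitor the TRAJECTORY_REPRESENTATION_WAYPOINTS stream sent by
# the FCU. Useful to verify the interface described above is active.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14540')  # assumed link
master.wait_heartbeat()

while True:
    msg = master.recv_match(type='TRAJECTORY_REPRESENTATION_WAYPOINTS',
                            blocking=True)
    # valid_points says how many of the 5 waypoint slots are filled;
    # unused slots are NaN, as described above.
    print(msg.valid_points, msg.pos_x[0], msg.pos_y[0], msg.pos_z[0])
```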

Mission Progression

The mission logic is handled by the navigator in the same way as for flight without obstacle avoidance, apart from two differences: