cvar-upm / cvg_ardrone2_ibvs

Vision based control for object following in UAVs
http://robotics.asu.edu/ardrone2_ibvs/

Pixhawk as an alternative to AR drone #10

Open AloshkaD opened 8 years ago

AloshkaD commented 8 years ago

Hi, can you advise on the easiest way to replicate the same experiment on a Pixhawk drone?

Thanks,

jespestana commented 8 years ago

Hi Ale,

First, you will need an onboard computer on your drone with ROS installed (which I guess you have). Otherwise, you need to redo the system integration work: change the image topic you are using to the one acquired by your drone's camera, change the control messages to the ones your drone listens to, etc.
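
For the topic plumbing, something like the following can serve as a starting sketch. Both topic names are placeholders of mine, not names from this repo, so check the real ones with rostopic list; and if the command message types differ, you will need a small translator node rather than a plain relay:

  # Relay your drone's camera image onto the topic the IBVS pipeline expects.
  # Both topic names below are placeholders -- check the actual ones with `rostopic list`.
  rosrun topic_tools relay /your_camera/image_raw /drone0/camera/front/image_raw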

You are probably going to need to do some controller tuning.

Besides me, my colleague Hriday Bavle (https://github.com/hridaybavle) can also provide some directions; he has already done this work with an AscTec Pelican. So you can just ask more specific questions along the way as you encounter other problems.


hridaybavle commented 8 years ago

Hi Ale,

The Pixhawk is integrated with the current version of Aerostack that we are using, so sending commands to the Pixhawk from the controllers is already implemented. What we have not tested is doing visual servoing with the Pixhawk, so it won't be entirely trivial. If you download Aerostack, go inside launchers and switch the launchers branch to master. There you will find the Pixhawk launchers, including an sh file called pixhawk_real_flight.sh.
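
Roughly like this (the workspace path is only an example of mine; adjust it to wherever you cloned Aerostack):

  cd ~/workspace/aerostack/launchers    # example path to your Aerostack checkout
  git checkout master                   # the Pixhawk launchers live on the master branch
  ls | grep -i pixhawk                  # should show pixhawk_real_flight.sh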

In this sh file you can add the nodes from the sh file you are currently launching, i.e. parrot_IBVSController_launcher_Release.sh (see the sketch after the list below):

  1. IBVSController node
  2. openTLD
  3. openTLD translator
  4. tracker Eye
  5. openTLDGUI
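
As a rough sketch, the additions could look like the lines below. The package and node names are placeholders, not the real ones, so copy the exact rosrun/roslaunch lines from parrot_IBVSController_launcher_Release.sh instead of retyping them:

  # Appended to pixhawk_real_flight.sh -- package/node names here are placeholders;
  # take the real launch lines from parrot_IBVSController_launcher_Release.sh.
  xterm -hold -e "rosrun ibvs_controller ibvs_controller_node" &
  xterm -hold -e "rosrun open_tld open_tld_node" &
  xterm -hold -e "rosrun open_tld_translator open_tld_translator_node" &
  xterm -hold -e "rosrun tracker_eye tracker_eye_node" &
  xterm -hold -e "rosrun open_tld_gui open_tld_gui_node" &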

You will also need to mount a camera and publish the camera image, as is done for the AR drone. For example:
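
With a USB camera and the usb_cam ROS driver, a minimal sketch would be the following (the relay target is a placeholder for whatever image topic the openTLD/IBVS nodes actually subscribe to):

  # Publish the camera image, then relay it onto the topic the trackers expect.
  # Check both topic names with `rostopic list` before wiring them together.
  rosrun usb_cam usb_cam_node _video_device:=/dev/video0 \
      _image_width:=640 _image_height:=480 _pixel_format:=yuyv &
  rosrun topic_tools relay /usb_cam/image_raw /drone0/camera/front/image_raw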

AloshkaD commented 8 years ago

jespestana and hridaybavle, thank you a million for your quick responses.

I do have a board running Ubuntu 14 and ROS Jade connected to the Pixhawk. I will test what hridaybavle recommended and update this answer to help other developers with this integration.
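
For anyone following along later: in my setup (this part is my own wiring, not something from this repo) the board talks to the Pixhawk through MAVROS, started roughly like this:

  # Serial device and baud rate are only an example -- they depend on how the
  # Pixhawk is wired to the companion board.
  roslaunch mavros px4.launch fcu_url:=/dev/ttyACM0:57600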

lalal321 commented 5 years ago

@AloshkaD Hello, how is the process of porting this project to the Pixhawk going? I am also very interested in this. Would you mind sharing your experience? Thank you very much.

Arsalan66 commented 4 years ago

Salam and hi Sir, I hope you are doing well. I am currently in the 6th semester of a Bachelor's in electronics engineering. The reason I am writing is that I am using the PX4 software stack on an IRIS drone and have successfully implemented convolutional neural networks to detect objects of interest in ROS. Now I want to update my script to implement IBVS on crops vs. non-crops. Can you kindly point me to how to update this script for my needs? I would be grateful for any algorithm or reading material you can provide, and I await your esteemed response.