NVIDIA-AI-IOT / redtail

Perception and AI components for autonomous mobile robotics.
BSD 3-Clause "New" or "Revised" License

Thanks for the Rover support #11

Open griz1112 opened 7 years ago

griz1112 commented 7 years ago

Thanks for the rover support. It arrived today, so I'm anxious to get this on it and start testing.

Alexey-Kamenev commented 7 years ago

Sure, let us know how it goes. Please note that the rover support is very basic/primitive: essentially direct PWM control via MAVROS RC override messages. Here is an example of the rover in action: https://www.youtube.com/watch?v=ZKF5N8xUxfw&t=127. The main goal is to show that it should be possible to enable APM support with relatively few changes. If your rover can use sensors like GPS or optical flow to get a pose, then our standard waypoint control should work as well. We haven't tested it on the rover, though, let alone on any APM-based vehicle...
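
For readers trying this on their own vehicle, here is a minimal sketch of what "direct PWM control via MAVROS RC override" looks like in practice. The topic name follows MAVROS conventions; the channel mapping (steering on channel 1, throttle on channel 3) and the pulse widths are assumptions for illustration, not redtail's actual controller code.

```python
# Minimal sketch of PWM control via MAVROS RC override (illustrative only).
# Channel mapping and pulse widths are assumptions; adjust for your vehicle.
import rospy
from mavros_msgs.msg import OverrideRCIn

rospy.init_node("rc_override_sketch")
pub = rospy.Publisher("/mavros/rc/override", OverrideRCIn, queue_size=1)
rate = rospy.Rate(20)  # override messages need to be streamed continuously

while not rospy.is_shutdown():
    msg = OverrideRCIn()
    # Leave all channels untouched except the ones we drive explicitly.
    msg.channels = [OverrideRCIn.CHAN_NOCHANGE] * len(msg.channels)
    msg.channels[0] = 1500  # steering: 1500 us = neutral
    msg.channels[2] = 1600  # throttle: slightly forward
    pub.publish(msg)
    rate.sleep()
```

RC override bypasses the autopilot's own controllers, so use it with care and release the channels (publish `CHAN_RELEASE`) when done.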

griz1112 commented 7 years ago

Pierre Kancier is the guy who does the ROS interfacing with APM. The rover code just underwent a major upgrade, and he is checking out the ROS part now. Once that is finished, we should be able to use the setpoint_velocity plugin, which takes cmd_vel; that plugin uses RC override too, but it's simpler to work with.

I have lots of sensors to try: a ZED, a RealSense, three different lidars, and a radar unit for obstacle avoidance. The rover came in Saturday and I finished wiring it last night. It has a TX2 on a J120. I thought this model had a good chance of working with a rover, since the footage was shot at low altitude.

This is the ArduRover R1. It comes with a Pixhawk 2.1 Cube and a RoboClaw motor controller. They just started selling them on the 1st of this month, ready to run with Mission Planner or a radio right out of the box. It runs APSync for streaming video, telemetry, tuning, etc. via a web page and the Wi-Fi hotspot it creates on the companion computer. Nick Nunno designed them; I help with the computer side of things. So this is what I want to implement your work on. I think it's going to be a really nice system for hobbyists and schools.
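
For reference, a hedged sketch of what driving through the setpoint_velocity plugin could look like. The topic name `/mavros/setpoint_velocity/cmd_vel` and the `TwistStamped` message type follow common MAVROS conventions; whether the ArduRover firmware accepts these setpoints depends on the upgrade mentioned above.

```python
# Sketch of velocity control through MAVROS's setpoint_velocity plugin
# (assumed topic and message type; not a verified ArduRover interface).
# For a ground rover only linear.x (forward speed) and angular.z (yaw rate)
# are meaningful.
import rospy
from geometry_msgs.msg import TwistStamped

rospy.init_node("cmd_vel_sketch")
pub = rospy.Publisher("/mavros/setpoint_velocity/cmd_vel",
                      TwistStamped, queue_size=1)
rate = rospy.Rate(20)  # setpoints must be streamed continuously

while not rospy.is_shutdown():
    cmd = TwistStamped()
    cmd.header.stamp = rospy.Time.now()
    cmd.twist.linear.x = 0.5   # forward speed, m/s
    cmd.twist.angular.z = 0.2  # yaw rate, rad/s
    pub.publish(cmd)
    rate.sleep()
```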


griz1112 commented 7 years ago

First test: https://youtu.be/Jado7z2cwhc


Alexey-Kamenev commented 7 years ago

This is cool, thanks for sharing! Does the rover have a Jetson with Redtail running already (I could not tell from the video)?

griz1112 commented 7 years ago

Not yet, but close. I just got it Saturday, and there was a shortage of Pixhawks, so I didn't get one (it went to a customer) and had to install all of that myself. I'm just about to get to Redtail. Those rovers sell for 800 bucks ready to go. I've been dealing with a few issues on the rovers, so I haven't had time to concentrate on it.

But the rover is going to be at the NVIDIA Showcase in November, and I'm going to do my best to have it running by then for the show. I might need to ask lots of questions :)

I don't have an Erle Brain, but I do have a Navio2, which is pretty similar from what I've read. So between the two I should get something going. And I can get in touch with Pierre about any problems so he can help fix them.

There are a couple more guys that have contacted me on FB that are trying to get it running as well.

I was just using an Xbox controller over the 2nd channel of the telemetry radio to control it. It has a 10 km range :) and up to 40 km with the right antennas. You just create DSM packets and send them down a SLIP/PPP connection.

There have been several inquiries about projects that Redtail would be perfect for, so we want to get it up and running as quickly as possible. If I have to, I'll use PX4 for the demo; no one will know but me :)

I built my camera array on 1/2" carbon tube. A pair of small clamps sits on top of the rover, and the array just clamps to that. I was afraid the motor spikes on the scooter would wreck one of my NVIDIA boards, so I just waited until the ArduRover was here. I'm just using 3 webcams, but they have adjustable focus and seem to work OK. Thanks again for putting this up; it would have taken me forever to get to this stage :)

This is my other rover; it's a pure ROS bot. I'm having a lot of issues with wheel alignment, though. It doesn't run straight, and the navigation gets all messed up trying to correct it. I need to design some motor holders that let me adjust the toe-in. It's pretty much all components from ServoCity, and it only takes one Allen wrench and a screwdriver to assemble. It has a TX1 in it, so it's Redtail-capable too. The new rover has a TX2 on a J120.


griz1112 commented 7 years ago

Finished the setup last night. I couldn't afford an Erle Brain, so I used a Navio2 and a Raspberry Pi 3. It's raining today so I can't test it, but the rest of the week is supposed to be nice. Do you think it would recognize two rows of orange cones instead of needing a full wall on each side?


Alexey-Kamenev commented 7 years ago

We used the Erle Brain primarily because it came as the default with the Erle Rover. You could replace it with any other flight controller, like a Pixhawk.

As for the cones: that would be an interesting experiment. I have a feeling it probably won't drive well with the default network, which was trained on forest images, so you might need to fine-tune the network on your own dataset.

griz1112 commented 7 years ago

I ran into a show-stopper with APM. The recent changes have altered the way it works, so it's not responding, and I'm going to switch over to PX4. How much difference is there between the two? How much code would need to be modified? It seems to me I could use the drone code to interface with PX4 and the model part would not need changing. Am I on the right track?

Actually, I think PX4 is much better written: more modular and easier to add drivers for other devices to. I've had much less difficulty working with it than with APM. But I'm in a sticky position, since the ArduRover is supposed to run APM. I've had the Pixhawk 2 Cube for a year now, and so far they haven't been able to get the ROS interface working. I'm not happy about it holding up my work in a big way.

I have a Roboteq controller and a ROS stack to run it, and I'm wondering if it wouldn't be easier just to integrate Redtail into that. My friend Raffaello Bonghi's GitHub has a complete ROS stack with perception, lidar, and RTAB-Map for SLAM, ready to go. So I'm going to look at those two options while the APM folks get their act together.
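
For context on how much would change at the MAVROS level: the main practical difference for external control is the flight mode (PX4 uses OFFBOARD, APM/ArduPilot uses GUIDED). A hedged sketch, assuming the standard MAVROS set_mode service:

```python
# Hedged sketch of the MAVROS-level difference between the two stacks:
# PX4 accepts external setpoints in OFFBOARD mode, while APM/ArduPilot
# uses GUIDED. Service name and mode strings follow MAVROS conventions.
import rospy
from mavros_msgs.srv import SetMode

rospy.init_node("mode_switch_sketch")
rospy.wait_for_service("/mavros/set_mode")
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)

# For PX4 (OFFBOARD also requires setpoints to be streaming beforehand):
# set_mode(base_mode=0, custom_mode="OFFBOARD")

# For APM / ArduRover:
set_mode(base_mode=0, custom_mode="GUIDED")
```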


griz1112 commented 7 years ago

Up and running. I'll be testing it in the next couple of days.


griz1112 commented 7 years ago

It's all up and working now; I found a forest trail here in town to test with. Do you think the bridge is doable? (Jetson and Pixhawk 2 serial connection.)
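
As a rough sketch of the Jetson-to-Pixhawk serial side: MAVROS is typically launched against the Jetson's UART (for example, `roslaunch mavros apm.launch fcu_url:=/dev/ttyTHS2:921600` — the device path and baud rate here are guesses for a TX2 carrier, not a verified setup), and the link can then be sanity-checked by watching /mavros/state:

```python
# Quick sanity check that MAVROS sees the Pixhawk over the serial link:
# print the connection flag and flight mode reported on /mavros/state.
import rospy
from mavros_msgs.msg import State

def on_state(msg):
    rospy.loginfo("FCU connected: %s, armed: %s, mode: %s",
                  msg.connected, msg.armed, msg.mode)

rospy.init_node("fcu_link_check")
rospy.Subscriber("/mavros/state", State, on_state)
rospy.spin()
```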


robonrrd commented 7 years ago

TrailNet has successfully flown our MAV over a small wooden bridge up here in Washington. We've also run our forest-trained model on a rover indoors, in our office hallways, with no retraining. So the generalization we've observed has been very good, but that's no guarantee...