go-scout is a tool that allows you to control a Moorebot Scout robot from your computer (without using the mobile app). The robot is controlled using an XBOX controller, and video is displayed in a new window.
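go-scout's actual control code isn't shown here, but the idea of mapping gamepad input to robot motion can be sketched in Go. This is an illustrative example only, not go-scout's API: the function and parameter names are hypothetical, and it assumes a mecanum-style drive (forward, strafe, rotate) like the Scout's, with normalised stick axes in the range -1..1.

```go
package main

import (
	"fmt"
	"math"
)

// deadzone zeroes small stick noise around the centre position.
func deadzone(v, threshold float64) float64 {
	if math.Abs(v) < threshold {
		return 0
	}
	return v
}

// stickToVelocity maps normalised gamepad axes (-1..1) to a velocity
// command for a mecanum-wheeled robot: vx (forward), vy (strafe),
// wz (rotation). maxLin and maxRot scale to the robot's limits.
// Names are illustrative, not go-scout's actual interface.
func stickToVelocity(lx, ly, rx, maxLin, maxRot float64) (vx, vy, wz float64) {
	vx = deadzone(-ly, 0.1) * maxLin // stick pushed up = forward
	vy = deadzone(lx, 0.1) * maxLin
	wz = deadzone(rx, 0.1) * maxRot
	return
}

func main() {
	// Left stick fully forward, right stick half right.
	vx, vy, wz := stickToVelocity(0.0, -1.0, 0.5, 0.6, 1.5)
	fmt.Printf("vx=%.2f vy=%.2f wz=%.2f\n", vx, vy, wz)
}
```

A real controller loop would poll the gamepad at a fixed rate and send the resulting velocity command to the robot over the network.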
Cool. I am actually working on a similar project using a ROS1 node on a desktop #2
Hi,
I'm working on a similar project: a ROS1 node on a Linux VM running a web GUI that displays the video stream and uses a gamepad to control the bot while recording a navigation path. I'm still in the reverse-engineering phase and playing around a bit. I added espeak-ng with mbrola and the three mbrola US voices, mlocate (so I can find files easily), and mpg321 to play audio files; I will add more. Video storage is on my QNAP. I replaced /var/roller_eye/devAudio/alert/alert2.wav (the detection siren sound) with a recording of an annoying small dog yipping. So now when the bot detects a human moving, it barks; when movement stops, it stops. Once the desktop ROS1 node is up, I will run RViz on the desktop and attempt mapping with monocular SLAM. I think the two disabled CPU cores will need to be enabled.
I plan on getting an LD06 LIDAR. There are several UARTs on the bot: if you take the top panel off, you will see a square pad with four pins. I am going to check whether I can use that port to interface the LIDAR. The LD06 is supported in both ROS1 and ROS2. Once the LIDAR is running, navigation will be much more accurate.
If you are looking to do something similar, here's the link: http://wiki.ros.org/navigation/Tutorials/RobotSetup