Open zxp771 opened 6 years ago
MAVLink, check examples here: https://github.com/intel/mavlink-router/tree/master/examples Just modify to get the messages that you want: http://mavlink.org/messages/common
Hi @zehortigoza
Thanks for your help. I know how to get those data now; it's easy with ROS. But I still have some questions:
1. Can I change the IMU operating frequency myself? I recorded imu/data and its rate seems to be 50 Hz. I'm wondering if the IMU can reach 100 or 200 Hz, because most visual-inertial algorithms need data collected at that frequency.
2. How can I do the extrinsic parameter calibration (between the camera and the IMU)?
3. Should the IMU and camera data I get from ROS be time-synchronized? I use rosbag to record these data.
Thanks for your help again.
@zxp771 I am also trying to make visual-inertial odometry work on the Intel Aero RTF. I have tried ROVIO and SVO 2.0. From my experience, the Aero compute board does not have enough computational power to run ROVIO, but it can run SVO 2.0.
You have two possibilities to do so:
In a terminal:

```
rosservice call /mavros/set_stream_rate 1 200 1
```
In C++:
```cpp
#include <ros/ros.h>
#include <mavros_msgs/StreamRate.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "set_stream_rate");
  ros::NodeHandle nh;
  ros::ServiceClient stream_rate_client =
      nh.serviceClient<mavros_msgs::StreamRate>("/mavros/set_stream_rate");

  // Request stream 1 (RAW_SENSORS, which carries the IMU data) at 200 Hz.
  mavros_msgs::StreamRate streamRate;
  streamRate.request.stream_id = 1;
  streamRate.request.message_rate = 200;
  streamRate.request.on_off = true;
  stream_rate_client.call(streamRate);
  return 0;
}
```
Neither of these worked for me, and I have no idea why. Let me know if you make it work.
Hi @szebedy Thanks for your advice. I'll try these methods and report the results back to you. I have some more questions: you said you use SVO 2.0 on your drone; which camera do you use for it? How do you get the intrinsic parameters for the camera? Have you tried other visual/visual-inertial algorithms like ORB-SLAM or VINS-Mono? Thanks for your help again.
Hi @szebedy
I tried both of the methods you suggested, but neither of them works. Do you have any new ideas? BTW, have you done the extrinsic calibration between the IMU and the camera?
Hi @szebedy I just made it work by using this command in a terminal: `rosrun mavros mavcmd long 511 105 10000 0 0 0 0 0`. It boosts the IMU rate to 100 Hz, but I don't actually know the meaning of the "10000 0 0 0 0 0" part. I want to boost the rate to 200 Hz for the extrinsic calibration. (I use the calibration tool named Kalibr: https://github.com/ethz-asl/kalibr/wiki/calibrating-the-vi-sensor)
Have you tried other visual/visual-inertial algorithms like ORB-SLAM or VINS-Mono?
@zxp771 I have tried running VINS-Mono. I used the bottom-facing camera and the BMI160 IMU sensor. The output was visualized in RViz. I have yet to integrate it with MAVROS to send MAVLink messages to the flight stack. VINS-Mono does automatic calibration of the IMU-camera extrinsic parameters.
@zxp771 I haven't found a way via mavros, but you can boost the mavros IMU rate from a NuttX shell.

Option 1) via mavproxy.py (if I remember correctly, it's installed with `pip install mavproxy`):

```
mavproxy.py --master=tcp:127.0.0.1:5760 --default=nsh
```

Option 2) via QGroundControl > MAVLink console

In this shell, execute:

```
nsh> mavlink stream -s HIGHRES_IMU -r 150 -d /dev/ttyS1
```
Check the result with `rostopic hz /mavros/imu/data_raw`.
Hi @lbegani Thanks for your reply. How is the performance of VINS-Mono?
Hi @mthz Thanks for your reply. Actually, I already boosted the IMU rate with the command above in the terminal, but I also tried your method. Thanks for your help.
Hi @zxp771
Check out this thread for more information about increasing the IMU rate in mavros.
In the command `rosrun mavros mavcmd long 511 105 10000 0 0 0 0 0`, you can change the number 10000 to 5000 to achieve 200 Hz. This number is the streaming period in microseconds.
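For reference, command 511 is MAV_CMD_SET_MESSAGE_INTERVAL: the first parameter (105) is the MAVLink message ID of HIGHRES_IMU, and the second is the desired interval in microseconds. A minimal sketch of the rate-to-interval conversion (the helper name is just illustrative):

```python
def rate_hz_to_interval_us(rate_hz):
    """Convert a desired stream rate in Hz to the interval in microseconds
    expected as the second parameter of MAV_CMD_SET_MESSAGE_INTERVAL (511)."""
    return int(1_000_000 / rate_hz)

# 100 Hz -> 10000 us (the value used in the command above)
print(rate_hz_to_interval_us(100))  # 10000
# 200 Hz -> 5000 us
print(rate_hz_to_interval_us(200))  # 5000
```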
Since there is no official data available, I estimated the IMU-camera extrinsics from the coordinate frames and the physical locations of the IMU (inside the flight controller) and the bottom camera. However, the results when using the IMU were so bad that I had to turn off the IMU entirely. Let me know if you manage to get better results with the IMU than without.
I attached my calibration files for SVO 2.0 (svo_aero.zip) and my sketch of the coordinate frames.
Hi @szebedy Thanks for your help. I'm trying to use Kalibr to do the calibration. I will share the results with you when I finish.
Hi @szebedy sorry to disturb you.
Does T_B_C here mean the transformation matrix between the bottom camera and the IMU?
Yes it does
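For anyone else reading: in Kalibr's convention, T_B_C is the 4x4 homogeneous transform that maps points from the camera frame C into the body/IMU frame B. A small sketch of how such a matrix is assembled and applied; the rotation and translation below are made-up example values, not the Aero's actual extrinsics:

```python
def make_T(R, t):
    """Assemble a 4x4 homogeneous transform (row-major nested lists)
    from a 3x3 rotation R and a 3-vector translation t."""
    return [R[i] + [t[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

def apply_T(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point p."""
    ph = p + [1.0]
    return [sum(T[i][j] * ph[j] for j in range(4)) for i in range(3)]

# Example only: camera rotated 180 degrees about x relative to the body,
# mounted 5 cm below the IMU (made-up numbers, not a real calibration).
R_B_C = [[1.0,  0.0,  0.0],
         [0.0, -1.0,  0.0],
         [0.0,  0.0, -1.0]]
t_B_C = [0.0, 0.0, -0.05]
T_B_C = make_T(R_B_C, t_B_C)

# A point 1 m along the camera's optical axis, expressed in the body frame:
p_B = apply_T(T_B_C, [0.0, 0.0, 1.0])  # -> [0.0, 0.0, -1.05]
```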
@zxp771 The performance (on the Aero) in terms of pose estimation accuracy looks good. In terms of processing time, however, it's not looking good. I have disabled loop closure and need to do a few more optimizations to bring down the latency. If you are looking for performance metrics, there is a paper with a benchmark comparison of different open-source VIO algorithms: http://rpg.ifi.uzh.ch/docs/ICRA18_Delmerico.pdf
Hi @lbegani Thanks for your reply and for sharing. I have read this paper before.
Hi everyone,
I was reading this thread and could not reach a conclusion: does the Aero platform allow capturing IMU measurements and Aero camera images in sync? Has anyone verified how good the synchronization between these measurements is?
Also, can we sync the IMU measurements with the camera measurements from any of the three cameras on the Aero (RealSense, 8 MP front-facing, and VGA bottom-facing)?
Hi, does anyone know how to capture synced IMU and camera data from the Aero?
@moshanATucsd Hi, as far as I know, the IMU and camera data on the drone need some post-processing to get synced data. The drone won't give us these directly.
Hi @araujokth Unfortunately, the drone won't give us synced data directly. We can only post-process the recorded data.
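To elaborate on the post-processing: since the camera is not hardware-triggered from the IMU clock, a common software workaround is to associate each image with the IMU sample whose timestamp is closest (or to interpolate between the two surrounding samples). A minimal nearest-timestamp matcher, assuming both streams carry timestamps from the same clock (e.g. recorded in one rosbag):

```python
import bisect

def match_nearest(imu_stamps, image_stamp):
    """Return the index of the IMU sample whose timestamp is closest
    to image_stamp. imu_stamps must be sorted in ascending order."""
    i = bisect.bisect_left(imu_stamps, image_stamp)
    if i == 0:
        return 0
    if i == len(imu_stamps):
        return len(imu_stamps) - 1
    # Pick whichever neighbour is closer to the image timestamp.
    if image_stamp - imu_stamps[i - 1] <= imu_stamps[i] - image_stamp:
        return i - 1
    return i

# IMU at 100 Hz; a camera frame stamped slightly after the second sample:
imu = [0.00, 0.01, 0.02, 0.03, 0.04]
print(match_nearest(imu, 0.012))  # 1 (0.01 is the closest IMU sample)
```

For visual-inertial estimators, interpolating the IMU samples to the image timestamp usually works better than nearest-neighbour matching, but the association step above is the starting point either way.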
Hi @szebedy sorry to disturb. Have you saved any datasets collected with the Intel drone? Could you share them with me? Thanks so much.
Hi @zxp771,
Unfortunately, I don't have any of the datasets anymore, but you can check out my repository written for the Intel drone under this link. It might have some useful information about visual-inertial odometry. In the end, I achieved the best results by simply not using the inertial data for odometry, only the camera stream.
Hi @szebedy Thanks for sharing! It's really helpful!
I'm wondering how I can get the GPS data and IMU data from the drone. I want to use the drone for development and research in visual-inertial odometry/SLAM. These data are very important, and it would be best to get them, together with the video data, in time synchronization.