sharish33 opened 4 years ago
Hi Martin,
I have trained the network with my own data (formatted as a KITTI odometry dataset) without CUDA, and training completed successfully for 500 epochs. However, training is very slow (each epoch takes roughly 80 seconds), probably because I ran it without CUDA. I also trained the network for different epoch counts (10, 50, 100, 400, 1000, etc.).
After training I did not see any difference in the results/plots (iekfnets.p). The results look the same as without any training, and those results look very poor on real-time data.
Also, during training the loss randomly increases and decreases from epoch to epoch. My assumption is that the loss should decrease after each epoch as the optimizer updates the network's weights and biases, but in my runs the training loss behaves erratically and is sometimes very high.
Could you please share your input in case I am missing something when using my own real-time data?
Best Regards, Harish
Hi @sharish33. Could you please give me some pointers on how to prepare your own data (stored in a bag file, for instance) in the KITTI format? Other than the IMU topic, what other information or topics should the bag contain?
Hi, the best way would be to get familiar with the KITTI odometry format. The paper below will be really useful: https://www.mrt.kit.edu/z/publ/download/2013/GeigerAl2013IJRR.pdf
Below is the format of the data you need to prepare from your bag file.
KITTI odometry (OXTS) format:
- lat: latitude of the oxts-unit (deg)
- lon: longitude of the oxts-unit (deg)
- alt: altitude of the oxts-unit (m)
- roll: roll angle (rad), 0 = level, positive = left side up, range: -pi .. +pi
- pitch: pitch angle (rad), 0 = level, positive = front down, range: -pi/2 .. +pi/2
- yaw: heading (rad), 0 = east, positive = counter clockwise, range: -pi .. +pi
- vn: velocity towards north (m/s)
- ve: velocity towards east (m/s)
- vf: forward velocity, i.e. parallel to earth-surface (m/s)
- vl: leftward velocity, i.e. parallel to earth-surface (m/s)
- vu: upward velocity, i.e. perpendicular to earth-surface (m/s)
- ax: acceleration in x, i.e. in direction of vehicle front (m/s^2)
- ay: acceleration in y, i.e. in direction of vehicle left (m/s^2)
- az: acceleration in z, i.e. in direction of vehicle top (m/s^2)
- af: forward acceleration (m/s^2)
- al: leftward acceleration (m/s^2)
- au: upward acceleration (m/s^2)
- wx: angular rate around x (rad/s)
- wy: angular rate around y (rad/s)
- wz: angular rate around z (rad/s)
- wf: angular rate around forward axis (rad/s)
- wl: angular rate around leftward axis (rad/s)
- wu: angular rate around upward axis (rad/s)
- pos_accuracy: position accuracy (north/east in m)
- vel_accuracy: velocity accuracy (north/east in m/s)
- navstat: navigation status (see navstat_to_string)
- numsats: number of satellites tracked by primary GPS receiver
- posmode: position mode of primary GPS receiver (see gps_mode_to_string)
- velmode: velocity mode of primary GPS receiver (see gps_mode_to_string)
- orimode: orientation mode of primary GPS receiver (see gps_mode_to_string)
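Each OXTS record is one whitespace-separated line with the fields above in order. A minimal parser sketch might look like this (this is an illustration, not the repository's own loader; the field names are taken from the list above):

```python
# Parse one line of a KITTI OXTS data file into a dict.
# Illustrative sketch only; field names follow the KITTI list above.
OXTS_FIELDS = [
    "lat", "lon", "alt", "roll", "pitch", "yaw",
    "vn", "ve", "vf", "vl", "vu",
    "ax", "ay", "az", "af", "al", "au",
    "wx", "wy", "wz", "wf", "wl", "wu",
    "pos_accuracy", "vel_accuracy",
    "navstat", "numsats", "posmode", "velmode", "orimode",
]

def parse_oxts_line(line):
    values = line.split()
    # The first 25 fields are floats; the last 5 are integer status fields.
    parsed = [float(v) for v in values[:25]] + [int(float(v)) for v in values[25:]]
    return dict(zip(OXTS_FIELDS, parsed))
```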
Best Regards, Harish
Hi @sharish33 , I think the initial parameters play a big role in the result, like initial error covariance, process noise covariance and measurement covariance. I wonder if you still used KITTIParameters or your own parameters when you trained successfully. If you used your own parameters, could you please give me some advice on how to set up these parameters? Through experience, IMU specification, or other methods? Thank you!
I agree with you; I also think the initial error covariance, process noise covariance, and measurement covariance should come from your IMU.
@sharish33 Have you trained the network on the KITTI dataset without CUDA? When I ran the training without CUDA, loss_train.backward() took about 434 seconds for a single epoch. My CPU is an AMD Ryzen 7 5800H with Radeon Graphics and my GPU is an Nvidia GeForce RTX 3070 Laptop GPU.
Yeah, I had trained without CUDA too; one epoch took about 400 seconds, same as you.
Question for Martin and Harish: regarding the KITTI odometry data format, my data comes from a moving vehicle and has only three accelerations (forward, left, and up) and three corresponding gyroscope values. I did find that Brossard's code uses only one set of accels and gyros and one set of velocities. Do I take it that RPY (roll, pitch, and yaw) and the velocities come from GPS results, since Brossard's paper only mentions accel and gyro as IMU quantities?
Can anyone tell me how GPS outage is modeled insofar as the test pickle files are concerned? Specifically, Brossard's pickle files are made of u (accel + gyro), and quantities like p_gt depend on lat/lon, v_gt is velocity, and Rot_gt are rotation matrices derived from roll, pitch, and yaw. How do you run a test case where you only have u, i.e. u and no GPS? For KITTI files, the standard is to replace the last three integer fields with -1, but Brossard's code ignores those numbers. I know the data is divided into segments, e.g. the first 9 for training and a 10th for testing, but what is the GPS content of that 10th segment if it represents the data after a GPS outage? Is it just a matter of setting the GPS variables to zero during the outage and counting your epochs?
Hello, Scott. I have the same problem as you. How does this code work with only IMU data? Did you solve it?
Hello @1248280302. I had to add the extra variables as indicated in the KITTI format and transform them into u (IMU accel + gyro), v_gt (velocity), p_gt (displacement), and ang_gt, where 'gt' stands for ground truth. I used interpolation to put all quantities on the same frequency. I transformed the GPS lat/lon/alt into NED coordinates to get p_gt, and from the time gradients of p_gt I got the velocities v_gt. The hard part is getting RPY: I used MATLAB, namely ecompass when I have magnetic data, or IMU-only sensor fusion otherwise, to obtain ang_gt. Bear in mind the program only needs v_gt and ang_gt at the initial time t=0; p_gt at t=0 is set/normalized to zero. So it is true that you mostly need IMU accel + gyro for a test run, but you do need the initial v_gt and ang_gt. Much of the specific information is in main_kitti.py, which shows how the data is read from the OXTS format and transformed into Brossard's variables in pickle format. Be careful about the convention: it seems to me mostly the ZYX standard for Euler angles, with gravity up along the z-axis, forward along the x-axis, and left along the y-axis. p_gt is a bit special (see the comments in the code). The program is set up to compare ground truth with integrated IMU, so the GPS data is needed for t>0 when the comparison is made at the end of the test in the results section.
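The lat/lon/alt-to-NED step and the velocity-by-differentiation step described above can be sketched as follows. This is a rough illustration using a flat-earth approximation around the first GPS fix (the variable names p_gt and v_gt follow this thread, not necessarily the repository's code):

```python
import numpy as np

# Convert GPS lat/lon/alt to local NED coordinates around the first fix,
# then differentiate position w.r.t. time to get velocity.
# Flat-earth (equirectangular) approximation; fine for short trajectories.
EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius (m)

def gps_to_ned(lat_deg, lon_deg, alt_m):
    lat = np.deg2rad(np.asarray(lat_deg))
    lon = np.deg2rad(np.asarray(lon_deg))
    alt = np.asarray(alt_m)
    # Local tangent plane centered on the first sample (so p_gt[0] = 0).
    north = (lat - lat[0]) * EARTH_RADIUS
    east = (lon - lon[0]) * EARTH_RADIUS * np.cos(lat[0])
    down = -(alt - alt[0])
    return np.stack([north, east, down], axis=1)

def velocity_from_position(p_gt, t):
    # Central differences w.r.t. (possibly non-uniform) timestamps.
    return np.stack([np.gradient(p_gt[:, i], t) for i in range(3)], axis=1)
```

For long trajectories or high-latitude data, a proper geodetic-to-ECEF-to-NED conversion is preferable to this flat-earth shortcut.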
I thought RPY would be in ENU. However, my most successful run with Brossard's program tells me it is in NED according to MATLAB's IMU filter, certainly for yaw and roll (and I gather pitch too). Can anyone confirm or deny?
I managed to get my own data to produce a good result from Brossard's program. I first had to chase out the drift in my IMU data using my own version of MATLAB's detrend and by examining low-speed segments (from GPS). That was not enough: I also had to change the noise covariance parameters (the variables cov_*, where * can be 'omega', 'acc', etc.), actually increasing them, to get a good result. I would appreciate feedback from anyone on this matter: how can these cov_* values be determined from the IMU data, or from the training process? I count 14 such parameters in main_kitti.py and also in utils_numpy_filter.py (in which 5 are updated during a test run). I get the impression that Brossard's choice of values for these covariances was found for the KITTI datasets taken with the (precise) OXTS sensor. I am not using that data (apart, of course, from running Brossard's test cases), and I suspect these parameters are sensor dependent.
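One common way to get a first estimate of the gyro/accel noise covariances from your own IMU data is to compute the sample variance over a segment where the vehicle is known to be stationary. A rough sketch (the names cov_omega / cov_acc mirror the parameters discussed above; a full Allan-variance analysis would be more rigorous, but this gives a usable starting point):

```python
import numpy as np

def static_noise_variance(imu, static_mask):
    """Estimate per-sensor noise variances from stationary samples.

    imu: (N, 6) array of [wx, wy, wz, ax, ay, az]
    static_mask: boolean (N,) array marking samples where the vehicle
                 is stationary (e.g. detected from near-zero GPS speed).
    Returns (gyro_var, acc_var) as scalars, averaged over the 3 axes,
    as a starting point for cov_omega / cov_acc style parameters.
    """
    static = imu[static_mask]
    gyro_var = static[:, :3].var(axis=0)
    acc_var = static[:, 3:].var(axis=0)
    # The filter parameters are scalars, so take the mean over the axes.
    return gyro_var.mean(), acc_var.mean()
```

In practice these static estimates tend to be lower bounds; as noted above, inflating them (to absorb unmodeled vibration and scale errors) often works better than using the raw sensor-datasheet values.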
Can anyone tell me why there are instructions for setting the noise covariance parameters in main_kitti.py as well as in utils_numpy_filter.py? The values for these settings also seem to be different.
Hi Martin,
Hope you are doing well. I would like to train the neural network with my own data sequences (prepared in the KITTI format) and ended up with some errors. Could you please give your input on the following:
I have installed torch version 1.4.0, but I do not have CUDA installed on my laptop. Is CUDA mandatory to run the training?
Best Regards, Harish