jjqcat opened this issue 4 years ago
Hi jjqcat, I found this project recently and would like to see this in action. To follow the development, it would be beneficial if you could post in English. Thanks in advance for the extra effort.
Thanks. This project has worked well in production in our navigation app. All the required sensors are listed in the readme. Platforms such as iOS (iPhone 5s, iPhone 6s, iPhone 7) and Android (HUAWEI, Honor, Xiaomi, vivo, OPPO, Samsung) have passed our road tests, and you don't need to buy any additional sensors.
In the current release, we did the following:
1. Extract the sensor data on each mobile platform and convert it into the input format required by the system.
2. Use strapdown inertial navigation theory as the system foundation, with a Kalman filter and an IIR low-pass filter (IIR-LPF) to correct the GPS and sensor data (see the sketch after this list).
3. Optimize for various practical scenarios.
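For readers unfamiliar with these two pieces, here is a minimal, self-contained Kotlin sketch of the general idea: a first-order IIR low-pass filter for smoothing raw sensor samples, and a 1-D Kalman predict/update cycle that blends a dead-reckoned displacement with a GPS fix. This is only an illustration of the technique, not code from this repository, and all gains/variances are made-up values.

```kotlin
// First-order IIR low-pass filter: smooths noisy sensor samples.
class IirLowPass(private val alpha: Double) {
    private var state = Double.NaN

    fun filter(sample: Double): Double {
        state = if (state.isNaN()) sample else alpha * sample + (1 - alpha) * state
        return state
    }
}

// 1-D Kalman filter: the predict step applies dead-reckoned motion,
// the update step corrects the estimate with a GPS measurement.
class Kalman1D(private val processVar: Double, private val measurementVar: Double) {
    var estimate = 0.0
        private set
    private var errorVar = 1.0

    fun predict(delta: Double) {
        estimate += delta
        errorVar += processVar
    }

    fun update(measurement: Double) {
        val gain = errorVar / (errorVar + measurementVar)
        estimate += gain * (measurement - estimate)
        errorVar *= (1 - gain)
    }
}

fun main() {
    val lpf = IirLowPass(alpha = 0.1)
    val kf = Kalman1D(processVar = 0.05, measurementVar = 4.0)

    // Fake data: noisy per-step displacements from the INS, then one GPS fix.
    listOf(1.1, 0.9, 1.2, 1.0).forEach { kf.predict(lpf.filter(it)) }
    kf.update(measurement = 4.5) // GPS says we are at 4.5 m along the track
    println("fused position estimate: ${kf.estimate}")
}
```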
In the next release, we plan to:
1. Use weak GPS signals as a reference to some extent.
2. Distinguish heading fluctuation from genuinely going off course, possibly using energy statistics.
3. Fix the inaccurate heading reported by GPS at low speeds (one common approach is sketched below).
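As an illustration of item 3 only (not necessarily what will ship in the next release), one common workaround is to trust the GPS course only above a minimum speed and otherwise fall back to a gyro-integrated heading. The class and threshold below are hypothetical:

```kotlin
// Hedged sketch: prefer GPS course when moving fast enough, otherwise use a
// heading integrated from the z-axis gyro rate. Illustrative only.
class HeadingSelector(private val minGpsSpeedMps: Double = 2.0) {
    private var gyroHeadingRad = 0.0

    // Integrate gyro rate (rad/s) over dt seconds to maintain a relative heading.
    fun onGyro(rateZ: Double, dtSec: Double) {
        gyroHeadingRad += rateZ * dtSec
    }

    // GPS course is unreliable at low speed, so gate it by speed.
    fun heading(gpsCourseRad: Double, gpsSpeedMps: Double): Double =
        if (gpsSpeedMps >= minGpsSpeedMps) {
            gyroHeadingRad = gpsCourseRad // re-anchor the relative gyro heading
            gpsCourseRad
        } else {
            gyroHeadingRad
        }
}
```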
Great work! How can I get sensor data from a Huawei phone? Thanks.
See the official Android/iOS docs:
https://developer.android.com/guide/topics/sensors/sensors_overview
https://developer.apple.com/documentation/coremotion
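On Android, the docs above boil down to registering listeners with `SensorManager`. A minimal Kotlin sketch (generic example, not code from this repository; the handler methods are placeholders):

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class ImuCollector(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        // SENSOR_DELAY_GAME (~50 Hz) is usually a reasonable rate for dead reckoning.
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER ->
                handleAccel(event.values[0], event.values[1], event.values[2], event.timestamp)
            Sensor.TYPE_GYROSCOPE ->
                handleGyro(event.values[0], event.values[1], event.values[2], event.timestamp)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* not needed here */ }

    // Placeholders: forward samples to whatever fusion pipeline you use.
    private fun handleAccel(x: Float, y: Float, z: Float, timestampNs: Long) {}
    private fun handleGyro(x: Float, y: Float, z: Float, timestampNs: Long) {}
}
```

On iOS, the equivalent data comes from CoreMotion (`CMMotionManager`), as covered in the second link.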
Thanks, I'm looking forward to the next version.
Ohh, I compiled this project after changing the source file encoding to Unicode, and it tested successfully.
If used in a real environment, must the sensors be calibrated first?
Emm, all the sensors must be calibrated before their data is passed to the algorithm. I will post the calibration code in the future; in the meantime, you can find many papers that discuss this.
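Since the project's calibration code is not published yet, here is a hedged sketch of one of the simplest calibration steps: estimating the gyroscope bias by averaging samples while the phone lies still, then subtracting it from subsequent readings. This is illustrative only and not the author's method.

```kotlin
// Estimate a constant gyro bias from stationary samples, then remove it.
class GyroBiasCalibrator {
    private val samples = mutableListOf<DoubleArray>()

    // Call repeatedly while the device lies still on a table.
    fun addStationarySample(x: Double, y: Double, z: Double) {
        samples.add(doubleArrayOf(x, y, z))
    }

    // The mean of stationary readings approximates the per-axis bias.
    fun bias(): DoubleArray {
        require(samples.isNotEmpty()) { "collect stationary samples first" }
        val sum = DoubleArray(3)
        for (s in samples) for (i in 0..2) sum[i] += s[i]
        return DoubleArray(3) { sum[it] / samples.size }
    }

    // Subtract the estimated bias from a raw reading before fusion.
    fun correct(raw: DoubleArray, bias: DoubleArray): DoubleArray =
        DoubleArray(3) { raw[it] - bias[it] }
}
```

Accelerometer and magnetometer calibration (scale factors, hard/soft iron) are more involved; the papers mentioned above cover those cases.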
The project is a very useful reference and the code comments are thorough. How do you get the sensor data required by the project from a mobile phone? Or what devices do you need to buy to collect the sensor data?