Closed jabrena closed 8 years ago
The lines you have cited are actually Lauro's mistake. His robots obviously do it correctly.
Those are correct:
DISPLACEMENT = (LEFT_ENCODER_COUNT + RIGHT_ENCODER_COUNT) * ENCODER_SCALE_FACTOR / 2
POS_X = POS_X + DISPLACEMENT * cos(HEADING)
POS_Y = POS_Y + DISPLACEMENT * sin(HEADING)
I am doing it around every 30 ms (~33 Hz). The robot travels at most 0.6 cm during this time. The more frequently, the better. For the HEADING I actually take the average of the headings at 0 ms and 30 ms. We could get an even better estimate because we have the first and second derivatives (angular speed and acceleration) at both time points from the gyro, so we could "integrate displacement along the arc".
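The update step above can be sketched as follows. This is only an illustration with assumed names and an assumed 4.32 cm wheel diameter; substitute your robot's real measurements. The averaged heading matches the trick described above.

```java
// Sketch of the dead-reckoning update described above (assumed names/values).
public class Odometry {
    // Hypothetical constants - use your robot's real values.
    static final double WHEEL_DIAMETER_CM = 4.32;
    static final double COUNTS_PER_REVOLUTION = 360.0;
    static final double ENCODER_SCALE_FACTOR =
        Math.PI * WHEEL_DIAMETER_CM / COUNTS_PER_REVOLUTION;

    double posX, posY;

    // leftCounts/rightCounts are encoder count deltas since the last update;
    // headingOld/headingNew are the headings (radians) at the interval's ends.
    void update(int leftCounts, int rightCounts, double headingOld, double headingNew) {
        double displacement = (leftCounts + rightCounts) * ENCODER_SCALE_FACTOR / 2.0;
        double heading = (headingOld + headingNew) / 2.0; // average of both headings
        posX += displacement * Math.cos(heading);
        posY += displacement * Math.sin(heading);
    }
}
```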
Do you think that EV3Gyro is OK to calculate a Pose?
First - this depends on what you are going to use the pose for. In, say, EKF SLAM the pose estimate is only used as the initial (prior) estimate; then you get the final (posterior) estimate by adding the information from a more accurate laser (or other sensor). There are successful SLAM implementations from odometry + laser only (no gyroscope).
If you want to estimate the pose only based on odometry + gyro, sincerely - I have no idea.
Lauro even compares EV3Gyro and CruizCore here:
http://www.robotnav.com/gyroscopes/
As for what I think - I kind of expect to get EV3Gyro from my woman for birthday or christmas... ;-)
I could install both on the robot and let it do 2 maps at the same time (like in the video, but split screen). This would be only a few ugly copy-and-pastes in my code. Either next week or during Christmas. We would have a visual comparison, and this is an interesting experiment on its own.
CruizCore calculates bias drift during reset and internally uses a Kalman filter. I don't know anything about the internals of the EV3 Gyro, but maybe its performance could be improved in the software layer (by, say, your own Kalman filter implementation / bias drift calculation).
I am 99.9% certain that CruizCore was actually used in a successful SLAM implementation, and almost sure that CruizCore was used in the Samsung Navibot. I have this robot and it works great. I would have to disassemble it to be sure, though.
You could also ask Lauro. I believe he has tried both devices in his stunning drawing robot. What he does in principle is follow the path (arcs, whatever). So he is estimating the pose from odometry + gyro.
You would have to mail him. He's the admin of robotnav.com.
What is the value for ENCODER_SCALE_FACTOR?
What is the formula for HEADING?
What is the value for ENCODER_SCALE_FACTOR?
You have most of that at Lauro's page:
http://www.robotnav.com/position-estimation/
but it's simply the distance traveled per single encoder count:
ENCODER_SCALE_FACTOR = PI * WHEEL_DIAMETER / COUNTS_PER_REVOLUTION
you can get WHEEL_DIAMETER from Sariel (great LEGO technic reference btw):
COUNTS_PER_REVOLUTION is the number of motor encoder counts per single motor revolution
In ev3dev it's the counts_per_rot motor attribute, and it's 360 for the EV3 large motor.
I thought that ev3dev had doubled the encoders' resolution, but actually it did not, to keep compatibility with other software (and sanity). It seems it's 360 counts per rotation everywhere.
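As a quick sanity check of the formula, here is a minimal sketch; the 4.32 cm wheel diameter is only an assumed example value, the 360 counts per rotation is the EV3 large motor figure mentioned above.

```java
public class ScaleFactor {
    // Distance traveled per single encoder count
    public static double encoderScaleFactor(double wheelDiameterCm, int countsPerRevolution) {
        return Math.PI * wheelDiameterCm / countsPerRevolution;
    }

    public static void main(String[] args) {
        // Assumed 4.32 cm wheel with the EV3 large motor's 360 counts per rotation:
        System.out.println(encoderScaleFactor(4.32, 360)); // roughly 0.038 cm per count
    }
}
```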
What is the calculus for HEADING?
CruizCore internally computes the accumulated angle so you simply read its value.
The same is true for the EV3 Gyro, but if you spin too many times the angle will get stuck. You can probably work around this in software.
See http://www.ev3dev.org/docs/sensors/lego-ev3-gyro-sensor/ third note, probably in all ev3 environments but I can't be sure.
Don't forget to calibrate by setting the appropriate mode while the robot is totally stationary (so maybe it calculates bias drift after all).
Good morning @bmegli,
This weekend I was working on templates to generate some parts from spec.json. This week I will continue with some examples for regulated motors. Later, I will develop this Navigator: http://www.lejos.org/rcx/api/josx/robotics/RotationNavigator.html http://sourceforge.net/p/lejos/rcx/code/HEAD/tree/trunk/lejos/src/java/classes/josx/robotics/RotationNavigator.java
When I handle a wheeled robot in a simple way, I will try to mix odometry from wheels with inertial data from gyro.
I watched your video and I liked it very much. Nice work, mate.
Ok!
Good Luck.
@jabrena I finally got the EV3 Gyro Sensor I need for other project.
As promised here is visual position estimation comparison:
I have also thrown pure odometry into the comparison and noted the motion models used.
As for the EV3 Gyro Sensor - it's ok but it's not a match for CruizCore.
LEGO declares it has ±3 degrees accuracy. I am not entirely sure that is true, but I would have to investigate further.
The EV3 Gyro internally uses the ISZ-655 gyroscope.
You can get the specs here: http://store.invensense.com/datasheets/invensense/PS-ISZ-0655B.pdf
One advantage of the EV3 Gyro is that it works at 1 kHz (10 times faster than CruizCore), but that is totally lost in its error (CruizCore will almost always give a more accurate reading despite the fact that its reading lags in time).
The same is true for the EV3 Gyro, but if you spin too many times the angle will get stuck. You can probably work around this in software.
It actually doesn't get stuck but overflows (in my own tests). I will update the docs. Anyway, this is not a problem in such an application (more than 90 rotations are needed to overflow) and it can be fixed in software (resetting the angle every now and then).
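The "reset every now and then" workaround could be sketched like this. The Gyro interface with readAngle()/reset() is entirely hypothetical - adapt it to your environment's sensor API - and the 30000-degree threshold is an arbitrary value safely below the overflow point.

```java
// Sketch: keep an unbounded accumulated angle by resetting the sensor
// before it overflows. Gyro is a hypothetical wrapper, not a real API.
public class UnboundedGyroAngle {
    public interface Gyro {
        double readAngle(); // accumulated angle in degrees since last reset
        void reset();       // set the sensor's accumulated angle back to 0
    }

    private final Gyro gyro;
    private double offset = 0.0; // sum of angles accumulated before resets

    public UnboundedGyroAngle(Gyro gyro) { this.gyro = gyro; }

    public double angle() {
        double raw = gyro.readAngle();
        if (Math.abs(raw) > 30000.0) { // still far from the overflow point
            offset += raw;
            gyro.reset();
            raw = 0.0;
        }
        return offset + raw;
    }
}
```

Note the robot may rotate between readAngle() and reset(), so reset while (nearly) stationary if you can; for this application the resulting error is tiny anyway.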
Good morning mate,
happy new year!!!
The video is pretty technical, congratulations. I have an EV3 Gyro at home, and I am waiting for this sensor to play with you on the gyro side. http://www.mindsensors.com/ev3-and-nxt/15-gyro-multisensitivity-accelerometer-and-compass-for-nxt-or-ev3
I tried to acquire a CruizCore XG1300L but I didn't find a reseller in Europe. Do you live in Europe? http://www.minfinity.com/eng/page.php?Main=1&sub=1&tab=5
When I finish this release, I will join you in these matters: https://github.com/jabrena/ev3dev-lang-java/milestones/0.2.0
How to mix wheel odometry with gyro data?
and I am waiting for this sensor to play with you on the gyro side
Quite an interesting device, with a 3-axis gyroscope and more, so its possible applications are wider than CruizCore's (just a one-axis gyro).
The EV3 Gyro & CruizCore integrate the angle internally. For the AbsIMU you will have to do it yourself, but that's pretty simple (you have the rotational speed, so it's just a matter of multiplying by time).
From the built-in compass you could always set the initial frame of reference (heading). Nice trick.
Do you live in Europe?
Yes.
I tried to acquire a CruizCore XG1300L but I didn't find a reseller in Europe.
I ordered mine directly from South Korea. It takes some time... and you have to add duty fees to the bill. I suppose one could rip a CruizCore variant out of a dead Samsung Navibot, but this is just my wild guess and it would not be the EV3-ready version.
How to mix wheel odometry with gyro data?
This is the "Motion Models" slide in the video. I did it on purpose so that you have the mathematical models for odometry and gyro-enhanced odometry side by side.
You will probably have to implement odometry yourself but it's less than 10 lines of code (really!).
Anyway, this is what we are discussing in this thread.
For odometry, high-level-wise, you would do the following as often as possible (e.g. in a while(true) loop), or often enough ;-):
X=0,Y=0,heading=0; //initial values
while(true)
{
GetLeftWheelDisplacementInCM
GetRightWheelDisplacementInCM
GetTotalDisplacementFromLeftAndRightWDInCM
GetAngleChangeFromLeftAndRightWDInDegrees
UpdateHeadingFromAngleChange
UpdateXFromTotalDisplacementAndHeading
UpdateYFromTotalDisplacementAndHeading
}
You would use the formulas from "Motion Models" in the video. The units like cm and degrees are fake here. You can use any unit as long as you are consistent.
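A sketch of the loop body above in Java follows. I don't have the "Motion Models" slide here, so the angle-change formula used is the standard differential-drive one, (right - left) / wheel base; the wheel diameter and wheel base are assumed example values, and reading the encoder counts from the motors is left to your environment's API.

```java
// Sketch of the odometry loop above. Units: cm and radians.
// Constants are assumed example values - measure your own robot.
public class OdometryLoop {
    static final double ENCODER_SCALE_FACTOR = Math.PI * 4.32 / 360.0; // assumed 4.32 cm wheel
    static final double WHEEL_BASE_CM = 12.0;                          // assumed wheel separation

    double x = 0, y = 0, heading = 0; // initial values
    int lastLeft = 0, lastRight = 0;  // encoder counts at the previous step

    // Call as often as possible with the current raw encoder counts.
    void step(int leftCount, int rightCount) {
        double dLeft  = (leftCount  - lastLeft)  * ENCODER_SCALE_FACTOR; // left wheel displacement
        double dRight = (rightCount - lastRight) * ENCODER_SCALE_FACTOR; // right wheel displacement
        lastLeft = leftCount;
        lastRight = rightCount;

        double displacement = (dLeft + dRight) / 2.0;           // total displacement
        double angleChange  = (dRight - dLeft) / WHEEL_BASE_CM; // standard differential-drive formula
        // use the mid-interval heading when updating X and Y
        x += displacement * Math.cos(heading + angleChange / 2.0);
        y += displacement * Math.sin(heading + angleChange / 2.0);
        heading += angleChange;
    }
}
```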
For gyroscope enhanced odometry you just replace the line:
GetAngleChangeFromLeftAndRightWDInDegrees
with
GetAngleChangeFromGyroscope
So basically you estimate robot displacement from encoders and robot angle change from the gyroscope.
Gyroscope Enhanced Odometry:
X=0,Y=0,heading=0; //initial values
while(true)
{
GetLeftWheelDisplacementInCM
GetRightWheelDisplacementInCM
GetTotalDisplacementFromLeftAndRightWDInCM
GetAngleChangeFromGyroscope
UpdateHeadingFromAngleChange
UpdateXFromTotalDisplacementAndHeading
UpdateYFromTotalDisplacementAndHeading
}
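In code, the only change from plain odometry is where the angle change comes from. A sketch assuming the gyro reports an accumulated angle in degrees (as the EV3 Gyro and CruizCore do); getting the displacement from the encoders is as before:

```java
// Sketch: gyro-enhanced odometry step. Displacement comes from the
// encoders (as before); the angle change comes from the gyroscope.
public class GyroOdometryStep {
    double x = 0, y = 0, heading = 0; // heading in radians
    double lastGyroDeg = 0;           // gyro reading at the previous step

    // displacementCm: total displacement from both encoders;
    // gyroAngleDeg: the sensor's current accumulated angle in degrees.
    void step(double displacementCm, double gyroAngleDeg) {
        double angleChange = Math.toRadians(gyroAngleDeg - lastGyroDeg);
        lastGyroDeg = gyroAngleDeg;
        x += displacementCm * Math.cos(heading + angleChange / 2.0); // mid-interval heading
        y += displacementCm * Math.sin(heading + angleChange / 2.0);
        heading += angleChange;
    }
}
```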
For the ABS IMU you will have to calculate the angle change yourself. The simplest way, I believe, would be:
Gyroscope Enhanced Odometry with ABSIMU:
X=0,Y=0,heading=0; //initial values
last_loop_time=now();
while(true)
{
GetLeftWheelDisplacementInCM
GetRightWheelDisplacementInCM
GetTotalDisplacementFromLeftAndRightWDInCM
GetAngleRateFromGyroscope
GetAngleChangeFromAngleRateAndTimePassedSinceLastLoopTime
UpdateHeadingFromAngleChange
UpdateXFromTotalDisplacementAndHeading
UpdateYFromTotalDisplacementAndHeading
last_loop_time=now();
}
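The rate-integration part of the loop above can be sketched as follows; the rate reading itself comes from your sensor API, so it is passed in here, and timestamps are plain nanosecond values (e.g. from System.nanoTime()).

```java
// Sketch: integrate angular rate over the loop period to get the heading.
// angle change = rate * time passed since the last loop.
public class RateGyroIntegrator {
    private double headingDeg = 0.0;
    private long lastLoopTimeNanos;

    public RateGyroIntegrator(long nowNanos) {
        lastLoopTimeNanos = nowNanos; // equivalent of last_loop_time = now()
    }

    // rateDegPerSec: angular rate read from the sensor; nowNanos: current time.
    public double update(double rateDegPerSec, long nowNanos) {
        double dt = (nowNanos - lastLoopTimeNanos) / 1e9; // seconds since last loop
        lastLoopTimeNanos = nowNanos;
        headingDeg += rateDegPerSec * dt;
        return headingDeg;
    }
}
```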
For ABSIMU angle integration you could also make use of the user-space sensors framework that dlech has implemented recently, but it's better to start from something simple.
Oh - and one last thing - when updating X and Y it's best to use the angle that was in the middle, I mean old_heading + angle_change/2, and update the heading as new_heading = old_heading + angle_change, especially if you have longer loop times.
You are the man! :D
Today, I will try to connect the RPLidar with the USB Adapter directly in order to know if it is possible to receive data: http://www.slamtec.com/en-US/rplidar/index/5
If you have time, check some PDFs here: http://www.slamtec.com/static/media/2014/09/rplidar/rplidar_sdk_v1.4.5.7z rplidar_sdk_v1.4.5\doc\en.US\rplidar_sdk_manual_en.pdf
If you have time, check some PDFs here:
Hardware-Wise:
Yes, it should be possible to use it both with the USB adapter and directly (soldering, somewhat in line with the XV11 Lidar, though not exactly).
The USB adapter uses a CP2102 chip as the USB-to-UART bridge. The driver should already be included in any modern Linux kernel (CP210x).
In fact it's included in ev3dev kernel tree:
ev3dev-kernel\drivers\usb\serial\cp210x.c
So, good news - the USB-to-UART bridge used in your adapter should work out of the box in ev3dev. Anyway, after plugging in the device, type:
dmesg | tail
to make sure.
Software-Wise later if I have time.
Software-Wise sketch:
When you connect the adapter the USB2UART bridge should be reachable at:
/dev/ttyUSB0
or something along those lines.
You will have to compile the driver/sample applications from the SDK. You have two basic routes here:
Anyway, follow SDK documentation.
For building on EV3 you will probably need something along the lines:
apt-get install build-essential
Maybe something more.
You will definitely encounter some problems along the way and get three or four headaches ;-)
But... it's also certainly doable.
Good luck.
Ok, some final remarks and guesses.
When you build the SDK you will also end up with a static or dynamic library with the RPLidar driver API.
You can use this library also from Java.
Ugh, and we should move the RPLidar integration discussion to separate thread.
Hi,
I continue the notes about this issue: https://github.com/ev3dev/ev3dev/issues/441
You use this formula to calculate a Pose:
Do you think that EV3Gyro is OK to calculate a Pose?
Juan Antonio