a1k0n / cycloid

self-racing car platform

Not an issue, just a question #2

Open sidtalia opened 5 years ago

sidtalia commented 5 years ago

I was reading the localize.cc file, and it appears that you are shifting the positions of (what I think are) trajectory points relative to the car, so that the car always stays at (0, 0). Is my understanding correct? If not, could you explain what the car is actually doing? Thanks.

a1k0n commented 5 years ago

No, that's not what it's doing... can you point to the line numbers you're looking at?

localize.cc is an implementation of Monte Carlo localization, AKA a particle filter. It maintains a bunch of hypotheses (particles) for where the car might be; each time it senses motion from the wheel encoders/gyro it moves every particle forward with some noise, and each time it detects cones it resamples the particles, keeping the ones that were likely to have seen the cones it actually saw.
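In rough C++, the predict/resample loop looks something like this (the types, field names, and noise numbers here are made up for illustration; they are not what localize.cc actually uses):

```c++
#include <cmath>
#include <random>
#include <vector>

struct Particle { float x, y, theta; };  // hypothetical particle state

// Predict: move every particle by the sensed odometry (ds from the wheel
// encoders, dtheta from the gyro), adding noise so the cloud spreads out
// to reflect motion uncertainty.
void Predict(std::vector<Particle>& ps, float ds, float dtheta,
             std::mt19937& rng) {
  std::normal_distribution<float> n_ds(0.f, 0.02f), n_dth(0.f, 0.01f);
  for (auto& p : ps) {
    float s = ds + n_ds(rng), th = dtheta + n_dth(rng);
    p.theta += th;
    p.x += s * std::cos(p.theta);
    p.y += s * std::sin(p.theta);
  }
}

// Resample: draw a fresh particle set, picking each survivor with
// probability proportional to how well it explains the cone observations.
void Resample(std::vector<Particle>& ps, const std::vector<float>& weights,
              std::mt19937& rng) {
  std::discrete_distribution<size_t> pick(weights.begin(), weights.end());
  std::vector<Particle> out(ps.size());
  for (auto& p : out) p = ps[pick(rng)];
  ps.swap(out);
}
```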

sidtalia commented 5 years ago

Oooh, okay, I think I got it. The reason it felt that way is that I saw particles[i].stuff and thought you were bundle-adjusting the points every time the car moved (which seemed rather computationally expensive).

One more thing: I find your project interesting; can I keep asking questions? I made something similar last year (and actually the year before that too; the one I made last year just had better hardware, and the code was more or less the same). I am working on a new version of it right now and would love to discuss things: https://www.youtube.com/watch?v=IqyNvaG0JE0, GitHub link: https://github.com/naughtyStark/Self-driving-car. In my case the localisation wasn't using computer vision (as in your case), because it is supposed to be a more "general purpose" car, i.e., able to work in unstructured environments (unlike the diy-robocar tracks), so visual markers are not a luxury I have at hand.

Thanks, Regards, Sidharth.

a1k0n commented 5 years ago

Not a bundle adjustment or anything like that, but yes, all the points are updated at 30 Hz. It's not a big deal for a Raspberry Pi to update 300×3 floats.

Nice work! It looks like your localization used IMU integration + GPS + an optical flow sensor? I've been meaning to try out an optical flow sensor, and it's awesome to see you got one to work. One thing I can't measure very easily is lateral velocity when the car is in a drift.

Sure, keep asking questions.

sidtalia commented 5 years ago

Okay, I'll keep this thread alive then.

Yes, I do notice the same problem of lateral drift with ArduRover as well, which is kind of what inspired me to make my project in the first place. You might also notice that it has some trajectory planning. I was trying to read your driver code but couldn't quite figure out some parts. Here is how my car works:

1. Get the state.
2. Find the parameters of the Bezier curve that joins the current position to the next waypoint (the details are on the GitHub page).
3. Find the ROC (radius of curvature) at the point where the car will be 2 servo/ESC cycles into the future (2 because the signal is sent out at the end of each servo cycle, and it would take at least one more cycle to take effect); see the sketch below.
4. Find the steering angle for that ROC (compensating for yaw-rate errors too), and calculate the throttle from the target velocity (derived from the ROC) and how much total acceleration the car is undergoing.
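For step 3, the curvature comes straight from the Bezier derivatives. A sketch, with a hypothetical Vec2 type (the real code is on my GitHub page):

```c++
#include <cmath>

struct Vec2 { float x, y; };

// Signed curvature of a cubic Bezier with control points p0..p3 at
// parameter t: kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2).
// The ROC is just 1/kappa.
float BezierCurvature(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, float t) {
  float u = 1 - t;
  // First derivative B'(t).
  Vec2 d1 = {3 * u * u * (p1.x - p0.x) + 6 * u * t * (p2.x - p1.x) +
                 3 * t * t * (p3.x - p2.x),
             3 * u * u * (p1.y - p0.y) + 6 * u * t * (p2.y - p1.y) +
                 3 * t * t * (p3.y - p2.y)};
  // Second derivative B''(t).
  Vec2 d2 = {6 * u * (p2.x - 2 * p1.x + p0.x) + 6 * t * (p3.x - 2 * p2.x + p1.x),
             6 * u * (p2.y - 2 * p1.y + p0.y) + 6 * t * (p3.y - 2 * p2.y + p1.y)};
  float cross = d1.x * d2.y - d1.y * d2.x;
  float speed = std::sqrt(d1.x * d1.x + d1.y * d1.y);
  return cross / (speed * speed * speed);
}
```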

In your code, I can't understand how you're getting the ROC/curvature at a point (which would be step 3 in my case).

Thanks, Regards, Sidharth

a1k0n commented 5 years ago

Ah. Yes, the code in here is a disaster with no clear design, just a bunch of experiments strung together.

I do something similar to you, but instead of using Bezier curves I use evenly spaced points with precomputed curvatures (1/radius). The line to drive is computed with an offline optimization step I run in a Jupyter notebook, and then I just upload all the waypoints as a text file, which the car reads as track.txt. The code to get the nearest waypoint is here: https://github.com/a1k0n/cycloid/blob/master/src/drive/trajtrack.cc. The idea is roughly what's sketched below.
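Something like this, just to convey the idea (the struct and function names are hypothetical, not the actual trajtrack.cc interface):

```c++
#include <cstddef>
#include <limits>
#include <vector>

// The racing line is a list of evenly spaced points with curvature
// precomputed offline; the whole list is loaded from track.txt.
struct Waypoint { float x, y, kappa; };  // kappa = 1/radius

// Tracking reduces to a nearest-point search against the car's (cx, cy).
size_t NearestWaypoint(const std::vector<Waypoint>& track, float cx, float cy) {
  size_t best = 0;
  float best_d2 = std::numeric_limits<float>::max();
  for (size_t i = 0; i < track.size(); ++i) {
    float dx = track[i].x - cx, dy = track[i].y - cy;
    float d2 = dx * dx + dy * dy;
    if (d2 < best_d2) { best_d2 = d2; best = i; }
  }
  return best;  // the curvature to drive starts from track[best].kappa
}
```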

So localization just determines my x, y, theta; then I find the closest point on the curve and use the control strategy outlined at the end of my blog post here to determine the curvature to drive, given the relative position/angle. I compute the control for every one of my particles and take the mean.
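To illustrate the per-particle averaging (the feedback law below is a generic stand-in with made-up gains; the actual control law is the one in the blog post):

```c++
#include <vector>

// Generic curvature-feedback stand-in: correct the path curvature by the
// cross-track error e_y and heading error e_theta of one particle relative
// to its closest waypoint. Gains ky/ktheta are illustrative only.
float ControlCurvature(float path_kappa, float e_y, float e_theta,
                       float ky = 1.0f, float ktheta = 2.0f) {
  return path_kappa - ky * e_y - ktheta * e_theta;
}

// Compute the control for every particle, then drive the mean.
float MeanControl(const std::vector<float>& per_particle_kappa) {
  float sum = 0;
  for (float k : per_particle_kappa) sum += k;
  return sum / per_particle_kappa.size();
}
```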

sidtalia commented 5 years ago

Okay (the blog post was a good read!).

In the trajtrack.cc file, I saw something like a "look-ahead" functionality: from what I can understand, the car looks a certain number of points ahead of the closest (x, y) coordinates. However, in the blog post you mention that the idea is to look for the maximum kappa ahead of the current position to determine the target velocity. (Also, I don't see lookahead_k being used anywhere in the controller.cc file.) Did I misinterpret the code somewhere?

If my understanding is somewhat correct and the look-ahead hasn't been implemented completely, I would like to contribute. I will make the changes in my fork and create a pull request when it is ready (it might take a while, considering my exams are going on right now and won't end until the end of December).

Thanks, Regards, Sidharth

a1k0n commented 5 years ago

Lookahead is used, but its member variable is confusingly named vk_. The TrajectoryTracker::GetTarget method takes a lookahead parameter and outputs a κ for velocity lookahead (hence vk_).
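The usual way a lookahead κ turns into a target speed is by capping lateral acceleration; roughly like this (the constants are made up, not values from this repo):

```c++
#include <algorithm>
#include <cmath>

// Target speed from a velocity-lookahead curvature: at speed v on a curve
// of curvature kappa, lateral acceleration is v^2 * kappa, so capping it
// at a_lat_max gives v = sqrt(a_lat_max / |kappa|).
float TargetSpeed(float lookahead_kappa, float a_lat_max = 8.0f,
                  float v_max = 10.0f) {
  float k = std::max(std::fabs(lookahead_kappa), 1e-3f);  // avoid divide-by-zero
  return std::min(std::sqrt(a_lat_max / k), v_max);
}
```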

sidtalia commented 5 years ago

Okay. I noticed that the lookahead distance was fixed, so I tried making it dynamic, and I have created a pull request for it.
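The basic idea was along these lines (the exact numbers and clamping in the PR may differ):

```c++
#include <algorithm>

// Dynamic lookahead: scale the distance with current speed v so the car
// looks a fixed time horizon ahead rather than a fixed distance, clamped
// to sane bounds at very low and very high speed.
float LookaheadDistance(float v, float horizon_s = 0.5f,
                        float min_m = 0.5f, float max_m = 3.0f) {
  return std::min(std::max(v * horizon_s, min_m), max_m);
}
```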

Thanks, Regards, Sidharth.