team4909 / 2019-Deep-Space

Team 4909's 2019 Robot Code
MIT License

Test PixyCAM for Alignment #153

Closed roshanr10 closed 5 years ago

roshanr10 commented 5 years ago

I'll be putting this together today and handing it off to Ashwin tonight so he can experiment with using this for alignment. I think he's trying vision for line following.

The Limelight is also heavily backordered, so I don't think we'll get much time to test it before DCMP even if we obtain one. Mr. Flood is pushing hard for it, so we'll see.

ashwinc12 commented 5 years ago

I posted most of my comments in the software channel, but I'll post them again here and add some more detail.

ashwinc12 commented 5 years ago

PixyCam update: I got the PixyCam and Arduino connected to the RIO via the USB serial port and was able to send and receive data both ways. All of the code can be found in the pixy-test branch, and I'll post the Arduino code I wrote here as well. The original plan was to have the RIO poll the Arduino after a command was called. Then, depending on whether the robot veered left, veered right, or stayed in the middle, the Arduino would write a different number of bytes per case to the RIO, which would move the drivetrain left, right, or keep it straight. The only problem I've faced so far is that the Arduino's serial buffer continuously fills up to 4096 bytes even after calling Serial.read(). If we can get around that problem and send the correct number of bytes, we can move on from there.
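A minimal sketch of the RIO-side decoding this plan implies: the Arduino writes a different number of bytes per case, and the RIO maps that count to a drivetrain correction. The class name, enum, and the specific byte counts are all hypothetical for illustration; the real mapping lives in the pixy-test branch.

```java
// Hypothetical sketch: map the number of bytes the Arduino wrote in one
// response to a drivetrain correction. The byte counts (1/2/3) are made up
// here; only the count-per-case protocol comes from the plan above.
public class PixySerialDecoder {
    public enum Correction { LEFT, RIGHT, STRAIGHT, UNKNOWN }

    public static Correction decode(int bytesReceived) {
        switch (bytesReceived) {
            case 1:  return Correction.LEFT;     // robot veered right -> steer left
            case 2:  return Correction.RIGHT;    // robot veered left -> steer right
            case 3:  return Correction.STRAIGHT; // centered on the line
            default: return Correction.UNKNOWN;  // partial read or stale buffer data
        }
    }
}
```

The `UNKNOWN` case matters because of the buffer issue described above: if stale bytes accumulate, the count will be wrong, so the RIO side should treat an unexpected count as "do nothing" rather than guess.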

ashwinc12 commented 5 years ago

(attached image: PixyCam line vector example, robot strayed right)

ashwinc12 commented 5 years ago

The way the correction works is by using the tip and tail of the vector returned by the PixyCam's getMainFeatures() method. The tip has coordinates (x1, y1) and the tail has coordinates (x0, y0). The picture above shows an example of the robot straying too far right: the distance from the center to x1 is greater than the distance from the center to x0, therefore we must correct left. Vice versa if we drift too far left.
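The comparison above can be sketched as a small pure function. This is a literal reading of the rule as described (tip farther from center than tail means correct left, and vice versa); the class and method names are hypothetical, and the actual code in the pixy-test branch may differ.

```java
// Hypothetical sketch of the tip/tail comparison. Inputs are the vector's
// tail x (x0), tip x (x1), and the frame's center column. Returns -1 to
// correct left, +1 to correct right, 0 when the offsets are equal.
public class VectorCorrection {
    public static int steer(int x0, int x1, int centerX) {
        int tipOffset  = Math.abs(x1 - centerX); // distance from center to tip
        int tailOffset = Math.abs(x0 - centerX); // distance from center to tail
        if (tipOffset > tailOffset) return -1;   // strayed right: correct left
        if (tailOffset > tipOffset) return  1;   // strayed left: correct right
        return 0;                                // balanced: hold course
    }
}
```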

ashwinc12 commented 5 years ago

After getting the byte-sending problem figured out, all that's left to do is write the remaining couple of lines of code on the RIO side and then test with gaffer's tape, on Frankie and on the actual robot.

ashwinc12 commented 5 years ago

Something that should be changed later, if we do end up going with the PixyCam, is adding an error threshold to the vector math described above. Right now the Pixy is very sensitive to corrections, so the robot continuously jerks itself. A tested, tuned threshold would fix this problem.
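One way the threshold could look as a deadband on the tip/tail offset difference; small imbalances produce no correction, so the robot stops jerking. The class name and the threshold value are hypothetical, and the right threshold would have to be found by testing as noted above.

```java
// Hypothetical deadband sketch: compute the tip-vs-tail offset imbalance
// and suppress corrections smaller than a tuned threshold.
public class DeadbandCorrection {
    // Returns -1 (correct left), +1 (correct right), or 0 (within tolerance).
    public static int steer(int x0, int x1, int centerX, int threshold) {
        int diff = Math.abs(x1 - centerX) - Math.abs(x0 - centerX);
        if (Math.abs(diff) <= threshold) return 0; // small error: hold course
        return diff > 0 ? -1 : 1;                  // same rule as before otherwise
    }
}
```

With `threshold = 0` this reduces to the original over-sensitive behavior, which makes it easy to tune upward on the practice surface.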

ashwinc12 commented 5 years ago

I was able to get communication between the RIO and the Arduino working and got the robot to line follow. I believe this proof of concept was a success.

roshanr10 commented 5 years ago

Love it. So what else needs to happen for this to go on the robot? Make a checklist here and put up a WIP PR.
