FRC5549Robotics / 5549-2020

FRC Team 5549's 2019-2020 Robot Code

Finding Distance using Limelight #2

Closed dylantknguyen closed 4 years ago

dylantknguyen commented 4 years ago

Fine tune contour and object detection of the reflective tape for Limelight.

JimTheScientist commented 4 years ago
  1. Peak - when we are gathering the ideal RPM measurements, we need to record only the RPM that is given when the peak of the ball's arc goes into the target.
  2. Distance to RPM - real vs. robot-measured. When we are gathering RPM data, we need to use the real measurements in feet, but when putting them into the regression, we need to use the robot's reported distance (because the robot's distance is what will be used in the algorithm). A regression sketch follows below.
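
As a minimal sketch of that regression, assuming numpy is available and using placeholder data points (not real measurements):

import numpy as np

# Placeholder data: robot-reported distances (ft) and the RPMs recorded at the shot's peak.
robot_distance_ft = np.array([5.0, 8.0, 11.0, 14.0, 17.0])
ideal_rpm = np.array([3000, 3400, 3900, 4500, 5200])

coeffs = np.polyfit(robot_distance_ft, ideal_rpm, 2)  # quadratic fit: rpm ~ a*d^2 + b*d + c
rpm_for_distance = np.poly1d(coeffs)

print(rpm_for_distance(10.0))  # estimated RPM for a 10 ft robot-reported distance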
JimTheScientist commented 4 years ago

I am far from a pro at doing math in Python with WPILib, but I believe this would be the code:

from networktables import NetworkTables #NetworkTables client used to read from the Limelight
import math #important because you need to do math

self.angle = NetworkTables.getTable("limelight").getNumber('ty', 0) #gets the vertical offset angle from the limelight (0 if no value yet)
self.angleTwo = self.angle + 45 #plus 45 because that is what the camera will be mounted at I believe
self.distance = 96.19 / math.tan(math.radians(self.angleTwo)) #distance = height / tan(angle); the 96.19 is from the real target height and about where the limelight is mounted on the CAD file

self.dash.putNumber("Distance to target (in)", self.distance) #writes the distance to the dashboard (putNumber since distance is a number, not a string)
tulser commented 4 years ago

So here are my thoughts in writing.

This means bringing the target's horizontal offset toward the center of the screen. We can do this given the vision objective's offset on screen, either with PID as a quick solution to drive the objective to center, or with some math. For the math, we only need the center of the contour (or its bounds) on screen, plus the camera's pixel resolution and its FOV, so that:

ax = (FOV/largestdim * xcenter) - (FOV/2)   [largestdim is the camera's largest dimension, in pixels]

This will produce a good approximate measurement of the alignment error between the robot and the target when the robot is reasonably far away; however, repeated image processing and applied turns will be needed to make the measurement solid. I know the Limelight will handle this math itself, but this solution is here as a contingency in case it fails.
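A minimal sketch of that offset calculation, assuming a horizontal FOV of about 59.6 degrees and a 320-pixel-wide image (both placeholders; substitute the real camera constants):

def horizontal_offset_deg(xcenter, fov_deg=59.6, largestdim=320):
    # Linear approximation: pixel 0 maps to -FOV/2 and pixel largestdim maps to +FOV/2.
    return (fov_deg / largestdim) * xcenter - (fov_deg / 2.0)

# Example: a contour centered at pixel x = 200 on a 320 px wide image gives a
# positive angle, meaning the target is to the right of the image center.
print(horizontal_offset_deg(200))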

The constants we will need for these calculations are the angle of the shooter mount and its position on the robot. The dependent variable that must be found is the velocity, and to find it we must know the distance from the robot to the target.

@VIDEOBOSS's solution is mostly the correct process for finding distance (though I am concerned by the seemingly careless use of self. on his variables), but I noticed some issues with it, so I'll present the math. First, we use roughly the same equation as above to find the angle; however, the angle will be found from the upper bound of the contour instead of the center, since the upper bound lies at the center of the high receptacle. Using the same equation but with values of y:

ay = (FOV/largestdim*ycenter)-(FOV/2)+aymount

This now gives the proper angle to the receptacle. We then apply the angle in the following chain of equations to get the distance:

d = (h / sin(ay)) * cos(ay) = h * cot(ay) = h / tan(ay)

Note that h should be the height of the receptacle center minus the shooter mount height, and d is the distance on the floor plane from the camera to the receptacle. The actual distance must now be translated to the position of the shooter, so that:

newd = d + mountdist

mountdist should be the displacement of the shooter from the camera; I'm assuming it will be a positive value when the shooter is positioned behind the camera.
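
A minimal sketch of this distance calculation, assuming h is in inches (the 96.19 value from the code earlier in the thread) and using a placeholder mountdist:

import math

def distance_to_target(ay_deg, h=96.19, mountdist=10.0):
    # ay_deg: angle to the receptacle in degrees, with the camera mount angle already added
    # h: receptacle center height minus the camera/shooter mount height, in inches
    # mountdist: shooter's offset behind the camera, in inches (10.0 is only a placeholder)
    d = h / math.tan(math.radians(ay_deg))  # d = h / tan(ay), floor-plane distance from the camera
    return d + mountdist                    # newd = d + mountdist, distance from the shooter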

dylantknguyen commented 4 years ago

Implemented but not tested in 9ffa21a

dylantknguyen commented 4 years ago

This has been tested in 3f29312 and is confirmed to be working. All of the values have been tuned.