NUbots / robocup

The NUbots' RoboCup Code
GNU General Public License v3.0

Robots track ball incorrectly #38

Closed mmetcalfe closed 11 years ago

mmetcalfe commented 11 years ago

The robots tend to keep the ball at the very edge of their view when they're trying to look at it. (This is thought to be caused by a change in the way that localization reports the ball position)

The robot should keep the ball centered horizontally within its view when it is attempting to track it.

josiahw commented 11 years ago

I think this is assigned in the head jobs.

StevenNicklin commented 11 years ago

The tracking shouldn't have anything to do with the localisation. In the past, head tracking used a variable in the field objects that contained the raw angle from the center of the screen to the center of the ball in image coordinates, set by vision: ScreenXTheta() and ScreenYTheta(). The tracking aimed to drive this value to zero, i.e. to put the ball's center at the center of the image.

It was done this way to get the responsiveness necessary to track the ball when it is moving quickly, since the filtered ball estimate won't always respond fast enough to a sudden change. I think there may also have been an optional offset, so that you could keep the ball in the lower half of the image and still see other things.

You should probably check which values are used by the head tracking to make sure they are the right ones, and then see whether vision is setting them correctly.

shannonfenn commented 11 years ago

As for how Vision is reporting it, I'll check that while making the changes necessary for the new distance-to-point calculations.

Jade-Fountain commented 11 years ago

Had a quick dig around for this. I think it is probably whatever Vector2(m_location.angular.x, m_location.angular.y) is in ball.cpp, around line 65. This is what screenThetaX() and screenThetaY() evaluate to.

WongAaronSW commented 11 years ago

If this helps, the old calculations were:

```cpp
// Old calculation for ScreenXTheta
double Vision::CalculateBearing(double cx) {
    double FOVx = deg2rad(46.40f);
    return atan((currentImage->getWidth()/2 - cx) /
                ((currentImage->getWidth()/2) / tan(FOVx/2.0)));
}

// Old calculation for ScreenYTheta
double Vision::CalculateElevation(double cy) {
    double FOVy = deg2rad(34.80f);
    return atan((currentImage->getHeight()/2 - cy) /
                ((currentImage->getHeight()/2) / tan(FOVy/2.0)));
}
```


shannonfenn commented 11 years ago

OK, so the issue is that the develop branch still includes the head angles in the camera bearing and elevation. This has already been removed in the branch I am working on, so I'll close the issue for now.

shannonfenn commented 11 years ago

To clarify so everyone knows what the issue was:

Vision reports multiple values for each field object, including:

- screen location and size,
- camera-relative bearing and elevation,
- neck-relative 3D position (radial), and
- a few variances (not used).

As Steve said, the robot uses the camera-relative angular position to quickly move the ball to the centre of the image. That value (and the screen values) aren't changed by localisation at all, so if there are problems with these it is Vision's fault (I'm not sure whether the same holds for the neck-relative values). As for the rest of the values used by localisation, I'm not so sure.