Mercury1089 / 2016-robot-code

Team Mercury 2016 FIRST Robotics season code for FIRST Stronghold

Highlight targeted goal on SmartDashboard camera display #3

Closed DrewTheRat closed 8 years ago

gartaud commented 8 years ago

Note: if this cannot be done directly in the smart dashboard, it might require streaming the edited video, with the cross-hair burned in, back from GRIP.

gartaud commented 8 years ago

see also #1

rpatel3001 commented 8 years ago

This can be accomplished either with a transparent .png added to the smartdash and dragged to the correct position, or by drawing rectangles on the image before it's sent over the network, using NIVision or OpenCV.
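(For reference, a rough sketch of the second route, assuming the OpenCV 3.x Java bindings and a `Mat` frame already grabbed from the camera; the class, method, and parameter names here are just for illustration and are not from our codebase:)

```java
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public class OverlayDrawer {

    /**
     * Draws a crosshair at the image center and a bounding box around the
     * target, before the frame is compressed and streamed to the dashboard.
     * The target box coordinates would come from our vision pipeline.
     */
    public static void drawOverlay(Mat frame, int targetX, int targetY,
                                   int targetW, int targetH) {
        Scalar green = new Scalar(0, 255, 0); // OpenCV uses BGR ordering

        // Static crosshair at the center of the image
        int cx = frame.cols() / 2;
        int cy = frame.rows() / 2;
        Imgproc.line(frame, new Point(cx - 10, cy), new Point(cx + 10, cy), green, 2);
        Imgproc.line(frame, new Point(cx, cy - 10), new Point(cx, cy + 10), green, 2);

        // Bounding box around the detected target
        Imgproc.rectangle(frame,
                new Point(targetX, targetY),
                new Point(targetX + targetW, targetY + targetH),
                green, 2);
    }
}
```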

DrewTheRat commented 8 years ago

Transparent .PNG is definitely the way to go. That will be really easy.

gartaud commented 8 years ago

So only a static crosshair? I was thinking of highlighting the target at the same time, so that we can easily align it to the crosshair (i.e. use the info from GRIP to draw the bounding box of the main target). Raj, how does NIVision compare to GRIP (which we currently use to easily access OpenCV's features)?

gartaud commented 8 years ago

Drew, thinking about it, the static crosshair might be confusing because the center of the image is not necessarily where we want the target to be, is it? (That depends on where the camera is located, and on the angle and speed at which the boulder is ejected...)

rpatel3001 commented 8 years ago

GRIP is new this year so I'm not familiar with it, but if it breaks out OpenCV's drawing functionality, I'd use it just to have everything in one pipeline. I tried NIVision a bit but it felt a little unwieldy. The OpenCV docs are really helpful too, with the caveat that they're written for C++; the pointers and such are wrapped in Java objects, and the conversion can be unintuitive. That shouldn't be a problem since it's all packaged into GRIP, though.

rpatel3001 commented 8 years ago

Also, NIVision runs on the roboRIO, while the openCV work that I did was part of a smartdash widget on the driver station laptop, after the image was already transmitted over the network.

gartaud commented 8 years ago

Ok. We need to investigate GRIP further. I don't know whether it will let us draw on the images, but if it can, we would just have to use the contours as the input to a step that draws rectangles. Of course that would not allow us to change the color of the rectangles based on our shooting criteria (since those are computed in the robot code), but maybe that is good enough combined with the gamepad feedback? It seems that we can at least add a step so that GRIP behaves as a streaming device, so I suppose it is possible to display the output from GRIP in the smart dashboard.

DrewTheRat commented 8 years ago

I don't think the cross-hairs give us much, really. I just thought it would look cooler. I think for actual game play we need to rely on our targeting logic and telling the drive team when to shoot...

gartaud commented 8 years ago

OK. I played a little with GRIP (you can download it by itself and use a webcam as input). It is a cool concept but, like the other FRC tools, it lacks proper documentation. I have most of the steps in place to display the contours of the targets in the smart display... but not all... so it might be a dead end.

gartaud commented 8 years ago

A workaround is to download the GRIP SmartDashboard Extension from https://github.com/WPIRoboticsProjects/GRIP-SmartDashboard/releases

DrewTheRat commented 8 years ago

Okay, now I am realizing what you are envisioning with this. I have renamed the issue to account for that. This would be good.

gartaud commented 8 years ago

Thanks. Ideally we would not only highlight the targeted goal but also provide an indication that it is ready to be shot. I don't know, however, how we could provide that indication, as it would require overlaying something on the video stream dynamically. In its current implementation the GRIP SmartDashboard extension will not show that the goal is ready to be shot, and it will not even show which target the robot is aiming at if multiple targets are found. I have therefore created a request for enhancement at https://github.com/WPIRoboticsProjects/GRIP-SmartDashboard/issues asking that the target with the largest area be highlighted more than the others (we always consider the target with the largest area to be the primary target).
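(On the robot side, picking the largest-area contour as the primary target is straightforward. A minimal sketch, assuming a GRIP pipeline with a "Publish ContoursReport" step using the default table name `GRIP/myContoursReport` and the 2016 WPILib NetworkTables API; the class name and the exact table name are assumptions, adjust to match the actual pipeline:)

```java
import edu.wpi.first.wpilibj.networktables.NetworkTable;

public class TargetSelector {

    // Table that GRIP's "Publish ContoursReport" step writes to; the
    // "myContoursReport" suffix depends on how the step is named in GRIP.
    private final NetworkTable grip = NetworkTable.getTable("GRIP/myContoursReport");

    /**
     * Returns the index of the contour with the largest area, or -1 if GRIP
     * has not reported any contours. We treat that contour as the primary target.
     */
    public int findPrimaryTarget() {
        double[] areas = grip.getNumberArray("area", new double[0]);
        int best = -1;
        double bestArea = 0;
        for (int i = 0; i < areas.length; i++) {
            if (areas[i] > bestArea) {
                bestArea = areas[i];
                best = i;
            }
        }
        return best;
    }

    /** Horizontal center of the primary target in pixels, or NaN if none was found. */
    public double primaryTargetCenterX() {
        double[] centerX = grip.getNumberArray("centerX", new double[0]);
        int i = findPrimaryTarget();
        return (i >= 0 && i < centerX.length) ? centerX[i] : Double.NaN;
    }
}
```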

rpatel3001 commented 8 years ago

Not sure if the source for the GRIP smartdash plugin is released so you can modify it, but if not, creating a custom widget is fairly simple. The one we made in 2014 is on GitHub.

gartaud commented 8 years ago

The source is available at https://github.com/WPIRoboticsProjects/GRIP-SmartDashboard/tree/master/src/main/java/edu/wpi/grip/smartdashboard. You are right that we could customize it... I will look at the 2014 project.

gartaud commented 8 years ago

OK, nice widget. So in 2014 the autonomous mode did not have access to target info (as it was computed on the PC rather than the robot itself)?

rpatel3001 commented 8 years ago

All the info was added to the network table, both so it could be displayed as a separate widget and so it was accessible to the robot.

gartaud commented 8 years ago

Ah, very nice! This year we use GRIP to replace what the 2014 widget was doing (i.e. compute what to put in the network tables)... but we are missing the custom rendering to the smart display. So we can either use the GRIP extension as an easy, not-too-bad approach or indeed write our own widget (more work, and I'm not sure we will have volunteers for it). Thanks.

DrewTheRat commented 8 years ago

Why not just display a green border around the video panel? We can probably do this by adding a larger green square behind the video. That would probably take less time to build, and it is more visible to the drive team... I really think we need our display feedback to be as binary as possible. It should either say "yes, shoot now!" or "no, can't shoot." I am worried that trying to interpret the video screen could be distracting and slow us down during a match. This is also why I want to rumble the gamepad, so there is no confusion on when to take the shot...
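(For reference, a minimal sketch of that binary feedback, assuming the 2016 WPILib `Joystick` and `SmartDashboard` APIs; the class name and the `readyToShoot` flag are hypothetical, the flag being whatever our targeting logic computes:)

```java
import edu.wpi.first.wpilibj.Joystick;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class ShooterFeedback {

    private final Joystick gamepad;

    public ShooterFeedback(Joystick gamepad) {
        this.gamepad = gamepad;
    }

    /**
     * Pushes a single yes/no indicator to the drive team: a boolean on the
     * SmartDashboard (which can be displayed as a large green/red box) plus
     * gamepad rumble while the shot is lined up.
     */
    public void update(boolean readyToShoot) {
        SmartDashboard.putBoolean("Ready to shoot", readyToShoot);

        float rumble = readyToShoot ? 1.0f : 0.0f;
        gamepad.setRumble(Joystick.RumbleType.kLeftRumble, rumble);
        gamepad.setRumble(Joystick.RumbleType.kRightRumble, rumble);
    }
}
```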

gartaud commented 8 years ago

OK

dsam7 commented 8 years ago

good enough, closing