Closed LoyVanBeek closed 7 years ago
How can referees know that the robot understands the language? Can we ask the robot to describe the actions it plans, to show that it really understands the command?
For a GUI, it's natural to use an app on an iPad to send the task.
There are many ways:
There are several steps to 'getting a command':

1. Speech recognition: hearing the actual words that were spoken.
2. Language understanding: figuring out what the command means.

The QR codes and typing are a way around step 1. Having a GUI with icons to indicate your command can help with step 2, up to some level I think.
Splitting the score for getting a command according to these steps would be good.
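To make the proposal concrete, here is a minimal sketch of what split scoring could look like. All function names and point values are invented for illustration; the actual rulebook would define them. The idea is just that the two steps earn points independently, so a team bypassing speech recognition (QR code, typing, GUI) can still earn the understanding points.

```python
# Hypothetical split scoring for 'getting a command'.
# Point values (10 / 20) are made up for illustration.

def score_command_acquisition(asr_succeeded, understood, used_bypass):
    """Return (total_points, breakdown) for the command-input phase.

    Step 1 (speech recognition) only scores when the robot actually
    recognized the spoken command; a bypass (QR code, typing, GUI)
    forfeits those points but keeps step 2 reachable.
    Step 2 (understanding) scores whenever the robot correctly
    grasped what the command means, regardless of input method.
    """
    breakdown = {
        'speech_recognition': 10 if (asr_succeeded and not used_bypass) else 0,
        'understanding': 20 if understood else 0,
    }
    return sum(breakdown.values()), breakdown
```

With this split, a robot that types in the command and understands it gets 20 of 30 points instead of the all-or-nothing outcome of the current rule.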
Agree with Loy.
I don't think teams should score for performing ASR per se, since what we want is robots solving the task: if the robot can't understand what it has been told, it's irrelevant whether the command was recognized.
The exception is Person & Speech Recognition, but that has been solved already in Pull Request #266. Since the Continue Rule already provides a workaround for the whole command input, and GUIs (now allowed) are by far more friendly than QR codes, we can consider this issue solved.
Understanding a command now consists of 2 parts: speech recognition and language understanding. If the former doesn't work, you do not get the chance to do the latter, which often happens in RoboCup@Home due to the noisy venue. It can work, but it can also be quite frustrating for both the team and the referee (speaking from experience here).
Splitting these points up would help a lot of teams. Also, in the intended application (care), some potential users of robots may have a speech impediment. Standard speech recognition might not work for those people, so an alternative input method could be useful.
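The workaround being discussed amounts to an input pipeline with fallbacks: try speech first, and if the noisy venue (or a speech impediment) defeats ASR, fall back to a friendlier channel. A rough sketch, with all names invented:

```python
# Hypothetical command-input pipeline with fallbacks.
# asr, gui and keyboard are callables supplied by the team's
# system; each returns the command string, or None on failure.

def get_command(asr, gui, keyboard, max_asr_attempts=3):
    """Try speech recognition first, then fall back to a GUI
    (e.g. icon selection on a tablet), then to typed input."""
    for _ in range(max_asr_attempts):
        text = asr()          # may return None in a noisy venue
        if text is not None:
            return text, 'speech'
    text = gui()              # icon-based GUI, now allowed
    if text is not None:
        return text, 'gui'
    return keyboard(), 'typed'  # last resort: type the command
```

Combined with split scoring, the second element of the returned tuple tells the referee which step the robot actually demonstrated.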
This will always be the procedure, no exceptions.