We need a way to link conversation items (e.g. "kitchen island", "coffee table") to navigation points for the robot to move to (ideally points that are advantageous for manipulation). I suggest we use the format a semantic map would be in, so that in the future we won't need to change the nlp module or the controller to get them working with a proper sem map.
The impact of this proposal is:
The nlp module has to translate the high-level symbols (and their synonyms) into tags found in the sem map, and provide those tags as part of the response to the controller's service request
The controller will read the sem map file and store the locations the robot should move to in order to be considered "at" a piece of furniture
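As a rough sketch of the split described above, here is a minimal example of what the two sides might look like. The sem-map JSON snippet, the tag names, the synonym table, and the function names are all illustrative assumptions, not a fixed format:

```python
import json
from typing import Optional

# Hypothetical sem-map snippet; the real file format, tags, and
# poses would come from the semantic mapping pipeline.
SEM_MAP_JSON = """
{
  "furniture": [
    {"tag": "kitchen_island",
     "nav_pose": {"x": 1.2, "y": 0.4, "theta": 1.57}},
    {"tag": "coffee_table",
     "nav_pose": {"x": -0.8, "y": 2.1, "theta": 0.0}}
  ]
}
"""

# nlp side: map conversational phrases (and synonyms) to sem-map tags.
SYNONYMS = {
    "kitchen island": "kitchen_island",
    "island": "kitchen_island",
    "coffee table": "coffee_table",
    "lounge table": "coffee_table",
}

def symbol_to_tag(phrase: str) -> Optional[str]:
    """Return the sem-map tag for a conversational phrase, or None if unknown."""
    return SYNONYMS.get(phrase.strip().lower())

# controller side: load the sem map once and index nav poses by tag.
def load_nav_points(sem_map_json: str) -> dict:
    """Build a tag -> nav_pose lookup from the sem-map file contents."""
    sem_map = json.loads(sem_map_json)
    return {item["tag"]: item["nav_pose"] for item in sem_map["furniture"]}

if __name__ == "__main__":
    nav_points = load_nav_points(SEM_MAP_JSON)
    tag = symbol_to_tag("Kitchen Island")
    print(tag, nav_points[tag])
```

The point of the split is that only the synonym table lives in the nlp module and only the tag-to-pose lookup lives in the controller, so swapping in a real sem map later only changes the data they load, not their interfaces.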