trancenoid opened this issue 2 years ago
Found a way (minimal sketch below):
1. Add the sensors to the Simulator Config.
2. Call `env.make()`.
3. Call `env.record_manually(output_dir)`.
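For reference, here is a minimal sketch of those steps. `SimulatorConfig`, the `Env` constructor, and the sensor names are my assumptions about the config schema; only `env.make()` and `env.record_manually(output_dir)` come from the steps above.

```python
# Minimal sketch of the recording steps above. SimulatorConfig, Env, and the
# sensor names are assumptions; adapt them to the actual config schema.
config = SimulatorConfig(sensors=["rgb", "segmentation"])  # assumed schema

env = Env(config)                    # assumed constructor taking the config
env.make()                           # build the simulator with sensors attached
env.record_manually("demos/run_01")  # drive with the keyboard; observations saved
```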
I noticed a problem with this method. I can get the observations (all segmentation and RGB views along with the multimodal values), but there is no way to get the actions I input to the simulator while recording my run. I looked into the code: because the `_observe` function pulls only the sensor readings, inputs given directly to the simulator through the keyboard are never captured. Is there a way to record actions too in `record_manually`?
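Until `record_manually` supports this directly, one possible workaround is to run the keyboard loop yourself so each action can be logged next to its observation. This is purely a sketch: the key-to-action mapping and the `env.step()` call returning `(obs, reward, done, info)` are assumptions about the simulator's interface, not its actual API.

```python
import json
import os

# Assumed mapping from key presses to simulator actions.
KEYMAP = {"w": "forward", "s": "backward", "a": "turn_left", "d": "turn_right"}

def record_with_actions(env, output_dir):
    """Manually drive the simulator and log (action, reward) per step.

    Assumes env.step(action) -> (obs, reward, done, info); substitute
    whatever interface the simulator actually exposes.
    """
    os.makedirs(output_dir, exist_ok=True)
    trajectory = []
    step = 0
    while True:
        key = input("action key (w/a/s/d, q to quit): ").strip().lower()
        if key == "q":
            break
        action = KEYMAP.get(key)
        if action is None:
            continue  # ignore unmapped keys
        obs, reward, done, info = env.step(action)
        # obs carries the sensor readings (RGB, segmentation, ...); save it
        # however record_manually() would, then log the paired action/reward.
        trajectory.append({"step": step, "action": action, "reward": reward})
        step += 1
        if done:
            break
    with open(os.path.join(output_dir, "actions.json"), "w") as f:
        json.dump(trajectory, f, indent=2)
```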
Is there any way to play the game (human input via keyboard) and record the RGB views, segmentation, actions, and rewards to augment the demonstration dataset?