Closed hp2500 closed 7 years ago
Check out human_action.py - we created it for this very purpose. It allows a human player to play Minecraft in the same way that a Malmo agent does - ie it turns mouse/keyboard events into Malmo commands, which are sent in the usual way, and can be recorded just as normal.
It's not possible to record the events that occur when a user plays Minecraft "directly" (eg by hitting enter to toggle to human input mode), since the action space is very different from that which is available to the Malmo agent. There's no Malmo analog for waggling the mouse, for example, so there would be no way for an agent to learn from that data.
If you want the more natural gameplay that comes from playing directly, you could, of course, record the observations from the human player rather than the commands. That would let you track the yaw, pitch, and position of the player, and you could potentially reconstruct a set of commands from that data. In theory, you could even use the AbsoluteMovementCommands handler, which provides commands for setting the yaw/pitch/position to absolute values - turning the observations into commands would then be a simple job, though a very verbose way of doing things.
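The observation-to-command reconstruction described above could be sketched roughly like this. This is only an illustration, not code from Malmo itself: it assumes the observation JSON contains the XPos/YPos/ZPos/Yaw/Pitch fields produced by the ObservationFromFullStats handler, and that the mission XML enables AbsoluteMovementCommands (which accepts `tp`, `setYaw`, and `setPitch`).

```python
import json

def observation_to_commands(obs_json):
    """Turn one recorded observation snapshot into absolute-movement commands.

    Assumes ObservationFromFullStats field names (XPos, YPos, ZPos, Yaw, Pitch);
    adjust the keys if your mission uses different observation producers.
    """
    obs = json.loads(obs_json)
    return [
        "tp {} {} {}".format(obs["XPos"], obs["YPos"], obs["ZPos"]),
        "setYaw {}".format(obs["Yaw"]),
        "setPitch {}".format(obs["Pitch"]),
    ]

# Example: replay a (hypothetical) recorded trace as a command stream.
trace = ['{"XPos": 0.5, "YPos": 64.0, "ZPos": 0.5, "Yaw": 90.0, "Pitch": 10.0}']
for snapshot in trace:
    for cmd in observation_to_commands(snapshot):
        print(cmd)  # in a real mission you would call agent_host.sendCommand(cmd)
```

As noted, this replays every sampled pose verbatim, so the resulting command log is much more verbose than the keyboard/mouse events that produced it.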
I hope that helps - I'll close this issue, but feel free to reopen it if you have further questions about this.
Thank you Davey, that is pretty much what I have done so far. One of my problems with human_action.py is that the gameplay feels like it's lagging, even on a very fast computer.
Hey there, I'm interested in creating tasks that can be solved by human players as well as AI. I know that I can observe the commands that are given to an AI agent.
Now the question: is it also possible to observe the "commands" a human player produces with keyboard and mouse while playing the game? I am trying to export all sorts of log data, and the human player's commands are the only thing that's still missing. I would be very grateful for any hints and ideas.
Best, Henry