eleurent / rl-agents

Implementations of Reinforcement Learning and Planning algorithms
MIT License

Caching of previous_state incompatible with Multi-Agent #80

Open TibiGG opened 2 years ago

TibiGG commented 2 years ago

While digging through the code, I discovered that the DQN agent (and possibly some others) caches a state value called previous_state in its act() and action_distribution() methods.

From the little digging that I did, it seems to be related to the side panel of the rendering, which shows extra information about the attention heads of the controlled vehicles.

The problem is that, when there is more than one controlled vehicle, previous_state is reassigned n+1 times during each act() call, where n is the number of vehicles: first as the tuple of observations of all agents, then once per agent's individual observation, so it ends up holding only the observation of the last controlled vehicle.

Snippet from rl_agents/agents/deep_q_network/abstract.py:

    def act(self, state, step_exploration_time=True):
        """
            Act according to the state-action value model and an exploration policy
        :param state: current state
        :param step_exploration_time: step the exploration schedule
        :return: an action
        """
        self.previous_state = state    #<==========HERE=============
        if step_exploration_time:
            self.exploration_policy.step_time()
        # Handle multi-agent observations
        # TODO: it would be more efficient to forward a batch of states
        if isinstance(state, tuple):
            return tuple(self.act(agent_state, step_exploration_time=False) for agent_state in state)

        # Single-agent setting
        values = self.get_state_action_values(state)
        self.exploration_policy.update(values)
        return self.exploration_policy.sample()
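
The effect can be reproduced with a tiny standalone sketch of the same control flow (the class and variable names here are illustrative, not the library's):

    class Agent:
        def act(self, state):
            # same pattern as the snippet above: unconditional caching, then recursion
            self.previous_state = state
            if isinstance(state, tuple):
                return tuple(self.act(agent_state) for agent_state in state)
            return 0  # dummy action for the single-agent branch

    agent = Agent()
    agent.act(("obs_1", "obs_2"))
    print(agent.previous_state)  # prints 'obs_2': only the last observation survives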

It does not seem like the most pressing issue, but I am noting it here in case anyone has a good idea for dealing with it, or can give a clearer explanation of why this variable is important, as I only gave one example of its usefulness. One workaround I can think of is sketched below.
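
A minimal sketch of one possible workaround (the _cache_state keyword is hypothetical, not part of the library): only the outermost call writes previous_state, so in the multi-agent case it keeps the full tuple of observations instead of being overwritten by each recursive call:

    def act(self, state, step_exploration_time=True, _cache_state=True):
        """
            Act according to the state-action value model and an exploration policy
        :param state: current state
        :param step_exploration_time: step the exploration schedule
        :return: an action
        """
        if _cache_state:
            # Only the outermost call caches, so previous_state keeps the
            # full tuple of observations in the multi-agent case
            self.previous_state = state
        if step_exploration_time:
            self.exploration_policy.step_time()
        # Handle multi-agent observations
        if isinstance(state, tuple):
            return tuple(self.act(agent_state, step_exploration_time=False, _cache_state=False)
                         for agent_state in state)

        # Single-agent setting
        values = self.get_state_action_values(state)
        self.exploration_policy.update(values)
        return self.exploration_policy.sample()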

Thanks!

eleurent commented 2 years ago

Yes, everything you said is absolutely correct.