Unity-Technologies / ml-agents

The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents using deep reinforcement learning and imitation learning.
https://unity.com/products/machine-learning-agents

StatsRecorder and mlagents_env working together by default #5405

Closed jziegle4 closed 3 years ago

jziegle4 commented 3 years ago

Is your feature request related to a problem? Please describe. When I use mlagents_envs with my own Python code, I would like to use the same Academy.Instance.StatsRecorder to record internal statistics without having to rely on a custom side channel. This would let me use the same executable both with the PPO/SAC trainers integrated into mlagents and with custom code built on mlagents_envs, which would be helpful.

Describe the solution you'd like Allow Academy.Instance.StatsRecorder to log values and write them to TensorBoard via a simple bool flag when instantiating the mlagents_envs UnityEnvironment. This would simplify the Python-side code, since StatsRecorder apparently cannot be used with mlagents_envs alone because its side channel is not registered by default.
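
For illustration only, a hypothetical sketch of what the requested API might look like. The `log_stats_to_tensorboard` flag does not exist in mlagents_envs; it and the `MyEnv` path are made up here purely to show the intent of the request:

```python
from mlagents_envs.environment import UnityEnvironment

# Hypothetical flag -- not part of the real API -- that would forward everything
# written to Academy.Instance.StatsRecorder on the Unity side into TensorBoard.
env = UnityEnvironment(
    file_name="MyEnv",              # path to the built executable (example name)
    log_stats_to_tensorboard=True,  # proposed option described in this request
)
```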

vincentpierre commented 3 years ago

Hi @jziegle4, I am not sure I understand the request. mlagents_envs deliberately does not send any information to TensorBoard, because many users do not need TensorBoard and do not even want to import it. Adding a dependency on TensorBoard to both mlagents_envs and mlagents would increase complexity. Using side channels allows for more flexibility, since the data can be logged from mlagents_envs into anything, not just TensorBoard. Is there an issue with the side channel API? Can you suggest some changes to its API to make it easier to use?
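
For context, a minimal sketch of a custom Python side channel along the lines of the Custom-SideChannels documentation; the channel name, UUID, and `MyEnv` path are placeholders, and the exact import paths may vary between mlagents_envs versions:

```python
import uuid

from mlagents_envs.environment import UnityEnvironment
from mlagents_envs.side_channel import IncomingMessage, SideChannel


class StatsLoggingChannel(SideChannel):
    """Receives (name, value) pairs sent from a matching C# SideChannel."""

    def __init__(self) -> None:
        # Must match the UUID used by the C# side channel (placeholder value here).
        super().__init__(uuid.UUID("621f0a70-4f87-11ea-a6bf-784f4387d1f7"))

    def on_message_received(self, msg: IncomingMessage) -> None:
        key = msg.read_string()
        value = msg.read_float32()
        # Log the value anywhere: TensorBoard, Weights & Biases, a CSV file, ...
        print(f"{key}: {value}")


channel = StatsLoggingChannel()
env = UnityEnvironment(file_name="MyEnv", side_channels=[channel])
```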

iceboy910447 commented 3 years ago

Hi @vincentpierre

I'd like to use the same executable with the mlagents PPO variant and with a DQN such as the Rainbow variant in Ray RLlib. Currently it is easy for me to use StatsRecorder to gather statistical data about the agent and plot it directly in TensorBoard without any changes to the Python code. But when I use my own DQN I have to change that, because I cannot log the data to TensorBoard by default. In that case I have to either disable StatsRecorder or implement my own custom side channel both in C# and in Python. In libraries like Ray RLlib most of the TensorBoard logging is handled internally, so it is harder to access and adapt. With that in mind, I am looking for a way to easily use StatsRecorder to log my data to TensorBoard while using other RL libraries that are harder to adapt. If there is a way to make that happen with the existing Side Channel API, I would be happy to use it; otherwise it would be nice to change the API of mlagents_envs so it can forward StatsRecorder data to TensorBoard.

vincentpierre commented 3 years ago

I think the simplest way to solve your problem is to use a StatsReporter with a TensorboardWriter and call add_stat and write_stats when the side channel receives information. Putting this logic inside of UnityEnvironment is challenging, because not all users will want their stats reported at the same time or in the same manner.
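
A minimal sketch of that suggestion, assuming the internal `mlagents.trainers.stats` module of the installed ml-agents release. Class and method names such as `StatsReporter`, `TensorboardWriter`, `add_stat`, and `write_stats` are version-dependent, so check them against your installation:

```python
from mlagents.trainers.stats import StatsReporter, TensorboardWriter

# Register a TensorBoard writer once; it is shared by all StatsReporter instances
# and typically writes event files under base_dir/<category>.
StatsReporter.add_writer(TensorboardWriter(base_dir="results"))

reporter = StatsReporter("my_custom_run")  # "my_custom_run" becomes the TensorBoard category

# Whenever the side channel hands you a (key, value) pair:
reporter.add_stat("Environment/MyCustomStat", 0.5)

# Periodically flush the accumulated values to TensorBoard at the current step:
reporter.write_stats(step=1000)
```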

jziegle4 commented 3 years ago

I will try that, but how can I connect the StatsRecorder on the Unity side to a custom side channel on the Python side? As I see in https://github.com/Unity-Technologies/ml-agents/blob/main/docs/Custom-SideChannels.md, I have to reference the side channel using an ID. So far I could not find out how to connect StatsRecorder with a side channel. I hope you can help me with that.

vincentpierre commented 3 years ago

Hi @jziegle4, you can receive the stats using this side channel. When you register it with your UnityEnvironment, it will receive the messages from the Unity-side StatsRecorder.

jziegle4 commented 3 years ago

Hi @vincentpierre,

I would like to try this, but currently when I follow this link I only get a 404 error. Maybe the link is incorrect or you are referencing a different branch. I tried to see if I could find a file in the colab folder, but I couldn't find one. I hope you can help me find the file you referenced.

vincentpierre commented 3 years ago

My bad. The document I tried to reference is here: https://github.com/Unity-Technologies/ml-agents/blob/main/ml-agents-envs/mlagents_envs/side_channel/stats_side_channel.py
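
Based on that file, a sketch of how the pieces could fit together in a custom training loop. It assumes `StatsSideChannel.get_and_reset_stats()` returns a mapping from stat name to a list of `(value, aggregation_method)` pairs, that the internal `mlagents.trainers.stats` module is available, and that `MyEnv` stands in for your executable:

```python
from mlagents_envs.environment import UnityEnvironment
from mlagents_envs.side_channel.stats_side_channel import StatsSideChannel
from mlagents.trainers.stats import StatsReporter, TensorboardWriter

# Register the built-in stats channel so Academy.Instance.StatsRecorder values
# from the Unity side are delivered to Python.
stats_channel = StatsSideChannel()
env = UnityEnvironment(file_name="MyEnv", side_channels=[stats_channel])

StatsReporter.add_writer(TensorboardWriter(base_dir="results"))
reporter = StatsReporter("rllib_run")  # placeholder run/category name

env.reset()
for step in range(10_000):
    # ... choose actions with your own RL library and call env.set_actions(...) ...
    env.step()

    # Drain everything the Unity side recorded since the last call and forward it.
    for key, values in stats_channel.get_and_reset_stats().items():
        for value, _aggregation in values:
            reporter.add_stat(key, value)
    reporter.write_stats(step)

env.close()
```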

github-actions[bot] commented 3 years ago

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.