Unity-Technologies / ml-agents

The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents using deep reinforcement learning and imitation learning.
https://unity.com/products/machine-learning-agents

Training with external brain raises Exception #818

Closed. OrtyUan closed this issue 6 years ago.

OrtyUan commented 6 years ago

Hi, I am using Unity 2017.1.0f3, and when I try to train my environment I get this error:

"Exception ArgumentException: An item with the same key has already been added. System.ThrowHelper.ThrowArgumentException (System.ExceptionResource resource) (at :0) System.Collections.Generic.Dictionary2[TKey,TValue].Insert (TKey key, TValue value, System.Boolean add) (at <a90417619fac49d5924050304d0280bb>:0) System.Collections.Generic.Dictionary2[TKey,TValue].Add (TKey key, TValue value) (at :0) Brain.SendState (Agent agent, AgentInfo info) (at <4282bb72582e4d569cd9952b328765a8>:0) Agent.SendInfoToBrain () (at <4282bb72582e4d569cd9952b328765a8>:0) Agent.SendInfo () (at <4282bb72582e4d569cd9952b328765a8>:0) Academy.EnvironmentStep () (at <4282bb72582e4d569cd9952b328765a8>:0) Academy.FixedUpdate () (at <4282bb72582e4d569cd9952b328765a8>:0)"

The only brain present is set to "External", and the game starts properly, but the agent does not move. When the brain is set to "Player", the game works and the agent moves.
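
For context, the trace shows Brain.SendState adding each agent's AgentInfo to a Dictionary keyed by the Agent, and Dictionary.Add throws exactly this exception when the same key is inserted a second time, i.e. when the same agent's info is added again before the brain's per-step collection is cleared. That is also why the message usually points to an earlier failure rather than being the root cause itself. Below is a minimal, self-contained C# sketch of that dictionary behavior (illustrative only, not the ML-Agents source; the key and values are made up):

    using System;
    using System.Collections.Generic;

    class DuplicateKeyDemo
    {
        static void Main()
        {
            // Stand-in for the Brain's per-step map of agents to their AgentInfo.
            var agentInfos = new Dictionary<string, string>();

            agentInfos.Add("CarTraining", "agent info for this step");

            try
            {
                // A second Add with the same key throws:
                // ArgumentException: An item with the same key has already been added.
                agentInfos.Add("CarTraining", "agent info added again");
            }
            catch (ArgumentException e)
            {
                Console.WriteLine(e.Message);
            }
        }
    }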

mmattar commented 6 years ago

Hi @OrtyUan - how many Agents are attached to the Brain? What are their names?

OrtyUan commented 6 years ago

Only one agent is present in the scene. In Unity its name is "CarTraining", and the script attached to it is "CarController", which is an "Agent" subclass. I tried the example scenes that ship with ML-Agents and they work. I then added the ML-Agents assets to my own project and ran into this problem.

vincentpierre commented 6 years ago

This error is usually symptomatic of a previous error. If you look at the list of errors Unity raises, there should be one above this one that will help us figure out what happened. Does your scene work with a Player brain?

OrtyUan commented 6 years ago

Yes, the scene works correctly with a Player brain. The rewards are correct, and the Academy also restarts the agent when it should. I am using Unity 2017.1.0f3, and the examples given in ML-Agents work.

xiaomaogy commented 6 years ago

@OrtyUan Could you give the complete error message?

OrtyUan commented 6 years ago

This is the error message that I get when I launch the Python program learn.py, with the only existing brain set to External:

Exception ArgumentException: An item with the same key has already been added.
  System.ThrowHelper.ThrowArgumentException (System.ExceptionResource resource) (at :0)
  System.Collections.Generic.Dictionary`2[TKey,TValue].Insert (TKey key, TValue value, System.Boolean add) (at :0)
  System.Collections.Generic.Dictionary`2[TKey,TValue].Add (TKey key, TValue value) (at :0)
  Brain.SendState (Agent agent, AgentInfo info) (at <4282bb72582e4d569cd9952b328765a8>:0)
  Agent.SendInfoToBrain () (at <4282bb72582e4d569cd9952b328765a8>:0)
  Agent.SendInfo () (at <4282bb72582e4d569cd9952b328765a8>:0)
  Academy.EnvironmentStep () (at <4282bb72582e4d569cd9952b328765a8>:0)
  Academy.FixedUpdate () (at <4282bb72582e4d569cd9952b328765a8>:0)

vincentpierre commented 6 years ago

Something may have gone wrong at the beginning of the simulation: this message is an error that happens every frame, so there should be another error raised earlier, on Awake. If you look at the console, there should be an error at the top of the list; I have never seen this error on its own. If it really is the only error you get, there could be a silent problem with your agent. What are you doing in your Agent script? Are you trying to access the Academy? Are you destroying objects in the Agent script?
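
For reference, here is a minimal sketch of the kind of Agent subclass being asked about, using the callback names from ML-Agents releases of roughly that period (v0.3-style API). Apart from the class name, which comes from the thread, the field names, observations, and rewards below are illustrative assumptions, not OrtyUan's actual code; later ML-Agents versions also require a using MLAgents; directive.

    using UnityEngine;

    public class CarController : Agent
    {
        // Illustrative target field; not part of the original issue.
        public Transform target;

        public override void CollectObservations()
        {
            // Feed observations to the brain; the number of values added here
            // should match the Vector Observation Space Size set on the Brain.
            AddVectorObs(transform.position);
            AddVectorObs(target.position);
        }

        public override void AgentAction(float[] vectorAction, string textAction)
        {
            // Apply the action and assign rewards here. Avoid Destroy()-ing the
            // Agent, Brain, or Academy objects and avoid re-registering the agent;
            // either can leave the Brain's per-step agent dictionary inconsistent.
            AddReward(-0.001f);
        }

        public override void AgentReset()
        {
            // Reset only the agent's own state; let the Academy drive resets.
            transform.position = Vector3.zero;
        }
    }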

awjuliani commented 6 years ago

Hi all,

I am closing this thread due to inactivity. Please feel free to re-open it if desired.

HodaAmir commented 5 years ago

Hi all, I am following the "Making a new learning environment" example, but when I want to test my environment manually using RollerBallPlayer I get this error that I cannot fix:

ArgumentException: An item with the same key has already been added. Key: RollerAgent (RollerAgent)
  System.Collections.Generic.Dictionary`2[TKey,TValue].TryInsert (TKey key, TValue value, System.Collections.Generic.InsertionBehavior behavior) (at <d7ac571ca2d04b2f981d0d886fa067cf>:0)
  System.Collections.Generic.Dictionary`2[TKey,TValue].Add (TKey key, TValue value) (at :0)
  MLAgents.Brain.SendState (MLAgents.Agent agent, MLAgents.AgentInfo info) (at Assets/ML-Agents/Scripts/Brain.cs:56)
  MLAgents.Agent.SendInfoToBrain () (at Assets/ML-Agents/Scripts/Agent.cs:619)

I am using Windows 10, Unity 2018.3.5f1. Any help is appreciated.

YongyiTang92 commented 5 years ago

> Hi all, I am following the "Making a new learning environment" example, but when I want to test my environment manually using RollerBallPlayer I get this error that I can not fix: ArgumentException: An item with the same key has already been added. Key: RollerAgent (RollerAgent) [...] I am using Windows 10, Unity 2018.3.5f1. Any help is appreciated.

I ran into the same problem, and I figured out that I had incorrectly assigned the Learning Brain instead of the Player Brain. Dragging in the Player Brain fixed the problem.

Zilch123 commented 4 years ago

I too ran into the same problem; in my case I had not set the Player Brain inputs properly.

github-actions[bot] commented 3 years ago

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.