Unity-Technologies / ml-agents

The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents using deep reinforcement learning and imitation learning.
https://unity.com/products/machine-learning-agents

ImportError: symbol not found in flat namespace '_CFRelease' #6087

Closed: dattienle2573 closed this issue 1 month ago

dattienle2573 commented 3 months ago

Describe the bug Running mlagents-learn on macOS fails at startup with ImportError: symbol not found in flat namespace '_CFRelease'; the error is raised while mlagents_envs imports grpc (full traceback below).

To Reproduce Steps to reproduce the behavior (I followed the steps at https://huggingface.co/learn/deep-rl-course/unit7/hands-on):

  1. Open a terminal and run the following (assuming you already have miniconda/miniforge or similar installed):
conda create --name rl python=3.10.12 -y
conda activate rl
  2. Clone the repository:

git clone https://github.com/Unity-Technologies/ml-agents

  3. Create a requirements.txt file with the following contents:
absl-py==2.1.0
attrs==23.2.0
cattrs==1.5.0
certifi==2024.2.2
charset-normalizer==3.3.2
cloudpickle==3.0.0
filelock==3.13.1
fsspec==2024.3.1
grpcio==1.48.2
grpcio-tools==1.46.5
gym==0.26.2
gym-notices==0.0.8
h5py==3.10.0
huggingface-hub==0.21.4
idna==3.6
Jinja2==3.1.3
Markdown==3.6
MarkupSafe==2.1.5
mpmath==1.3.0
networkx==3.2.1
numpy==1.23.5
onnx==1.15.0
packaging==24.0
PettingZoo==1.15.0
pillow==10.2.0
pipdeptree==2.16.1
protobuf==3.20.3
PyYAML==6.0.1
requests==2.31.0
six==1.16.0
sympy==1.12
tensorboard==2.16.2
tensorboard-data-server==0.7.2
torch==2.2.1
tqdm==4.66.2
typing_extensions==4.10.0
urllib3==2.2.1
Werkzeug==3.0.1
  4. Install the requirements and the two local packages (an import sanity check for this step is sketched after these steps):
pip install -r requirements.txt
pip install -e ./ml-agents-envs
pip install -e ./ml-agents
  5. Download the SoccerTwos build from this folder:
https://drive.google.com/drive/folders/1h7YB0qwjoxxghApQdEUQmk95ZwIDxrPG

Put the whole download into a new directory called training-envs-executables, then run:

xattr -cr training-envs-executables/SoccerTwos/SoccerTwos.app
  6. Finally, run this to train the agents:
mlagents-learn ./config/poca/SoccerTwos.yaml --env=./training-envs-executables/SoccerTwos/SoccerTwos.app --run-id="SoccerTwos" --no-graphics
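
As mentioned in step 4, here is a minimal import sanity check worth running between installation and training. It is not part of the course instructions; it simply attempts the imports that mlagents-learn performs at startup (the same chain that fails in the traceback below) and prints the interpreter details, assuming the rl conda environment is active:

# sanity_check.py - hedged sketch: verify the imports that mlagents-learn
# needs at startup succeed in the active conda environment.
import platform
import sys

print("python      :", sys.version.split()[0], "at", sys.executable)
print("architecture:", platform.machine(), "/", platform.platform())

for name in ("grpc", "mlagents_envs.environment", "mlagents.trainers.learn"):
    try:
        __import__(name)
        print("import", name, ": OK")
    except Exception as err:  # on this machine: the '_CFRelease' dlopen failure below
        print("import", name, ": FAILED ->", err)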

Console logs / stack traces

Traceback (most recent call last):
  File "/opt/homebrew/Caskroom/miniforge/base/envs/rl/bin/mlagents-learn", line 33, in <module>
    sys.exit(load_entry_point('mlagents', 'console_scripts', 'mlagents-learn')())
  File "/opt/homebrew/Caskroom/miniforge/base/envs/rl/bin/mlagents-learn", line 25, in importlib_load_entry_point
    return next(matches).load()
  File "/opt/homebrew/Caskroom/miniforge/base/envs/rl/lib/python3.10/importlib/metadata/__init__.py", line 171, in load
    module = import_module(match.group('module'))
  File "/opt/homebrew/Caskroom/miniforge/base/envs/rl/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/Users/dat/ml-agents/src/mlagents/ml-agents/mlagents/trainers/learn.py", line 2, in <module>
    from mlagents import torch_utils
  File "/Users/dat/ml-agents/src/mlagents/ml-agents/mlagents/torch_utils/__init__.py", line 1, in <module>
    from mlagents.torch_utils.torch import torch as torch  # noqa
  File "/Users/dat/ml-agents/src/mlagents/ml-agents/mlagents/torch_utils/torch.py", line 6, in <module>
    from mlagents.trainers.settings import TorchSettings
  File "/Users/dat/ml-agents/src/mlagents/ml-agents/mlagents/trainers/settings.py", line 25, in <module>
    from mlagents.trainers.cli_utils import StoreConfigFile, DetectDefault, parser
  File "/Users/dat/ml-agents/src/mlagents/ml-agents/mlagents/trainers/cli_utils.py", line 5, in <module>
    from mlagents_envs.environment import UnityEnvironment
  File "/Users/dat/ml-agents/src/mlagents-envs/ml-agents-envs/mlagents_envs/environment.py", line 49, in <module>
    from .rpc_communicator import RpcCommunicator
  File "/Users/dat/ml-agents/src/mlagents-envs/ml-agents-envs/mlagents_envs/rpc_communicator.py", line 1, in <module>
    import grpc
  File "/opt/homebrew/Caskroom/miniforge/base/envs/rl/lib/python3.10/site-packages/grpc/__init__.py", line 22, in <module>
    from grpc import _compression
  File "/opt/homebrew/Caskroom/miniforge/base/envs/rl/lib/python3.10/site-packages/grpc/_compression.py", line 15, in <module>
    from grpc._cython import cygrpc
ImportError: dlopen(/opt/homebrew/Caskroom/miniforge/base/envs/rl/lib/python3.10/site-packages/grpc/_cython/cygrpc.cpython-310-darwin.so, 0x0002): symbol not found in flat namespace '_CFRelease'
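
A note beyond the original report: '_CFRelease' is a macOS CoreFoundation symbol, and a dlopen failure like this on import of cygrpc is often a sign that the installed grpcio extension does not match the interpreter (for example an x86_64 build under an arm64 miniforge). A small diagnostic along these lines, assuming macOS and the same conda environment, can help check that without importing grpc:

# Hedged diagnostic (an assumption, not from the ML-Agents docs): locate the
# installed cygrpc extension without executing the grpc package, and list the
# CPU architectures it was built for to compare against the interpreter.
import glob
import os
import platform
import subprocess
from importlib.metadata import version
from importlib.util import find_spec

print("interpreter arch:", platform.machine())
print("grpcio version  :", version("grpcio"))

spec = find_spec("grpc")  # locates the package without running __init__.py
if spec and spec.submodule_search_locations:
    pkg_dir = list(spec.submodule_search_locations)[0]
    for so in glob.glob(os.path.join(pkg_dir, "_cython", "cygrpc*.so")):
        # `lipo -archs` is a standard macOS tool that prints the slices
        # (e.g. "arm64" or "x86_64") contained in a compiled binary.
        archs = subprocess.run(["lipo", "-archs", so],
                               capture_output=True, text=True).stdout.strip()
        print(so, "->", archs or "(lipo could not read this file)")

If the architectures disagree, a workaround commonly suggested in similar grpc reports (not verified for this exact case, since the issue was closed as stale) is to reinstall grpcio so that pip builds or fetches a wheel matching the active interpreter, e.g. pip install --force-reinstall --no-binary :all: grpcio grpcio-tools, or to move to a newer grpcio than the pinned 1.48.2.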

Environment (from the paths and pins above): macOS with a miniforge conda environment under /opt/homebrew (which suggests Apple Silicon), Python 3.10, grpcio 1.48.2, ml-agents and ml-agents-envs installed editable from the cloned repository.

github-actions[bot] commented 2 months ago

This issue is stale because it has been open for 30 days with no activity.

github-actions[bot] commented 1 month ago

This issue was closed because it has been inactive for 14 days since being marked as stale. Please open a new issue for related bugs.