Unity-Technologies / ml-agents

The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents using deep reinforcement learning and imitation learning.
https://unity.com/products/machine-learning-agents

ERROR: Failed building wheel for numpy when installing ml-agents==1.0.0 #6008

Closed ohernpaul closed 10 months ago

ohernpaul commented 10 months ago

The guide and documentation state that 1.0.0 has Requires-Python >=3.10.1,<=3.10.12. With an Anaconda environment anywhere in this range, I am failing to build the wheel for numpy. The numpy version the ml-agents 1.0.0 install actually downloads and tries to build is 1.21.2 (the declared floor is 1.13.3), and it fails with this error:

× Building wheel for numpy (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [287 lines of output]
    setup.py:63: RuntimeWarning: NumPy 1.21.2 may not yet support Python 3.10.

The steps I am taking are:

  1. Create a new conda environment with a Python version in the range defined above.
  2. Navigate to the directory containing the ml-agents and ml-agents-envs folders.
  3. Run pip install ./ml-agents (a command sketch follows this list).
  4. The numpy wheel build fails with the error above.
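
A minimal reproduction of those steps, assuming the ml-agents source tree has already been downloaded (the environment name matches the log below; the path is just a placeholder):

conda create -n mlagents_python_3_10_12 python=3.10.12
conda activate mlagents_python_3_10_12
cd path\to\ml-agents-master
pip install ./ml-agents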

I have even tried installing numpy 1.21.2 beforehand (the version the package pins for Python 3.10), and it still fails.

To make sure my Anaconda setup is correct, I re-created my previous environment for ml-agents==0.28.0 with Python 3.7.16, and that worked fine on the first try.

(mlagents_python_3_10_12) D:\MProjects\AIsland\ml-agents-master\ml-agents>pip install ./
Processing d:\mprojects\aisland\ml-agents-master\ml-agents
  Preparing metadata (setup.py) ... done
Collecting grpcio<=1.48.2,>=1.11.0 (from mlagents==1.0.0)
  Using cached grpcio-1.48.2-cp310-cp310-win_amd64.whl (3.6 MB)
Collecting h5py>=2.9.0 (from mlagents==1.0.0)
  Using cached h5py-3.10.0-cp310-cp310-win_amd64.whl.metadata (2.5 kB)
Collecting mlagents_envs==1.0.0 (from mlagents==1.0.0)
  Using cached mlagents_envs-1.0.0-py3-none-any.whl.metadata (2.4 kB)
Requirement already satisfied: numpy<2.0,>=1.13.3 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from mlagents==1.0.0) (1.22.1)
Collecting Pillow>=4.2.1 (from mlagents==1.0.0)
  Using cached Pillow-10.1.0-cp310-cp310-win_amd64.whl.metadata (9.6 kB)
Collecting protobuf<3.20,>=3.6 (from mlagents==1.0.0)
  Using cached protobuf-3.19.6-cp310-cp310-win_amd64.whl (895 kB)
Collecting pyyaml>=3.1.0 (from mlagents==1.0.0)
  Using cached PyYAML-6.0.1-cp310-cp310-win_amd64.whl.metadata (2.1 kB)
Collecting torch>=1.13.1 (from mlagents==1.0.0)
  Using cached torch-2.1.0-cp310-cp310-win_amd64.whl.metadata (24 kB)
Collecting tensorboard>=2.14 (from mlagents==1.0.0)
  Using cached tensorboard-2.15.1-py3-none-any.whl.metadata (1.7 kB)
Requirement already satisfied: six>=1.16 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from mlagents==1.0.0) (1.16.0)
Collecting attrs>=19.3.0 (from mlagents==1.0.0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting huggingface_hub>=0.14 (from mlagents==1.0.0)
  Using cached huggingface_hub-0.19.0-py3-none-any.whl.metadata (13 kB)
Collecting onnx==1.12.0 (from mlagents==1.0.0)
  Using cached onnx-1.12.0-cp310-cp310-win_amd64.whl (11.5 MB)
Collecting pypiwin32==223 (from mlagents==1.0.0)
  Using cached pypiwin32-223-py3-none-any.whl (1.7 kB)
Collecting cattrs<1.7,>=1.1.0 (from mlagents==1.0.0)
  Using cached cattrs-1.5.0-py3-none-any.whl (19 kB)
Collecting cloudpickle (from mlagents_envs==1.0.0->mlagents==1.0.0)
  Using cached cloudpickle-3.0.0-py3-none-any.whl.metadata (7.0 kB)
Collecting gym>=0.21.0 (from mlagents_envs==1.0.0->mlagents==1.0.0)
  Using cached gym-0.26.2-py3-none-any.whl
Collecting pettingzoo==1.15.0 (from mlagents_envs==1.0.0->mlagents==1.0.0)
  Using cached PettingZoo-1.15.0-py3-none-any.whl
Collecting numpy<2.0,>=1.13.3 (from mlagents==1.0.0)
  Using cached numpy-1.21.2.zip (10.3 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting filelock>=3.4.0 (from mlagents_envs==1.0.0->mlagents==1.0.0)
  Using cached filelock-3.13.1-py3-none-any.whl.metadata (2.8 kB)
Collecting typing-extensions>=3.6.2.1 (from onnx==1.12.0->mlagents==1.0.0)
  Using cached typing_extensions-4.8.0-py3-none-any.whl.metadata (3.0 kB)
Collecting pywin32>=223 (from pypiwin32==223->mlagents==1.0.0)
  Using cached pywin32-306-cp310-cp310-win_amd64.whl (9.2 MB)
Collecting fsspec>=2023.5.0 (from huggingface_hub>=0.14->mlagents==1.0.0)
  Using cached fsspec-2023.10.0-py3-none-any.whl.metadata (6.8 kB)
Requirement already satisfied: requests in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from huggingface_hub>=0.14->mlagents==1.0.0) (2.31.0)
Collecting tqdm>=4.42.1 (from huggingface_hub>=0.14->mlagents==1.0.0)
  Using cached tqdm-4.66.1-py3-none-any.whl.metadata (57 kB)
Requirement already satisfied: packaging>=20.9 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from huggingface_hub>=0.14->mlagents==1.0.0) (23.2)
Collecting absl-py>=0.4 (from tensorboard>=2.14->mlagents==1.0.0)
  Using cached absl_py-2.0.0-py3-none-any.whl.metadata (2.3 kB)
Collecting google-auth<3,>=1.6.3 (from tensorboard>=2.14->mlagents==1.0.0)
  Using cached google_auth-2.23.4-py2.py3-none-any.whl.metadata (4.7 kB)
Collecting google-auth-oauthlib<2,>=0.5 (from tensorboard>=2.14->mlagents==1.0.0)
  Using cached google_auth_oauthlib-1.1.0-py2.py3-none-any.whl.metadata (2.7 kB)
Collecting markdown>=2.6.8 (from tensorboard>=2.14->mlagents==1.0.0)
  Using cached Markdown-3.5.1-py3-none-any.whl.metadata (7.1 kB)
Requirement already satisfied: setuptools>=41.0.0 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from tensorboard>=2.14->mlagents==1.0.0) (68.0.0)
Collecting tensorboard-data-server<0.8.0,>=0.7.0 (from tensorboard>=2.14->mlagents==1.0.0)
  Using cached tensorboard_data_server-0.7.2-py3-none-any.whl.metadata (1.1 kB)
Collecting werkzeug>=1.0.1 (from tensorboard>=2.14->mlagents==1.0.0)
  Using cached werkzeug-3.0.1-py3-none-any.whl.metadata (4.1 kB)
Collecting sympy (from torch>=1.13.1->mlagents==1.0.0)
  Using cached sympy-1.12-py3-none-any.whl (5.7 MB)
Collecting networkx (from torch>=1.13.1->mlagents==1.0.0)
  Using cached networkx-3.2.1-py3-none-any.whl.metadata (5.2 kB)
Collecting jinja2 (from torch>=1.13.1->mlagents==1.0.0)
  Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting cachetools<6.0,>=2.0.0 (from google-auth<3,>=1.6.3->tensorboard>=2.14->mlagents==1.0.0)
  Using cached cachetools-5.3.2-py3-none-any.whl.metadata (5.2 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.6.3->tensorboard>=2.14->mlagents==1.0.0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.6.3->tensorboard>=2.14->mlagents==1.0.0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting requests-oauthlib>=0.7.0 (from google-auth-oauthlib<2,>=0.5->tensorboard>=2.14->mlagents==1.0.0)
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting gym-notices>=0.0.4 (from gym>=0.21.0->mlagents_envs==1.0.0->mlagents==1.0.0)
  Using cached gym_notices-0.0.8-py3-none-any.whl (3.0 kB)
Requirement already satisfied: charset-normalizer<4,>=2 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from requests->huggingface_hub>=0.14->mlagents==1.0.0) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from requests->huggingface_hub>=0.14->mlagents==1.0.0) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from requests->huggingface_hub>=0.14->mlagents==1.0.0) (2.0.7)
Requirement already satisfied: certifi>=2017.4.17 in c:\programdata\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages (from requests->huggingface_hub>=0.14->mlagents==1.0.0) (2023.7.22)
Collecting colorama (from tqdm>=4.42.1->huggingface_hub>=0.14->mlagents==1.0.0)
  Using cached colorama-0.4.6-py2.py3-none-any.whl (25 kB)
Collecting MarkupSafe>=2.1.1 (from werkzeug>=1.0.1->tensorboard>=2.14->mlagents==1.0.0)
  Using cached MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl.metadata (3.1 kB)
Collecting mpmath>=0.19 (from sympy->torch>=1.13.1->mlagents==1.0.0)
  Using cached mpmath-1.3.0-py3-none-any.whl (536 kB)
Collecting pyasn1<0.6.0,>=0.4.6 (from pyasn1-modules>=0.2.1->google-auth<3,>=1.6.3->tensorboard>=2.14->mlagents==1.0.0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Collecting oauthlib>=3.0.0 (from requests-oauthlib>=0.7.0->google-auth-oauthlib<2,>=0.5->tensorboard>=2.14->mlagents==1.0.0)
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Using cached mlagents_envs-1.0.0-py3-none-any.whl (89 kB)
Using cached h5py-3.10.0-cp310-cp310-win_amd64.whl (2.7 MB)
Using cached huggingface_hub-0.19.0-py3-none-any.whl (311 kB)
Using cached Pillow-10.1.0-cp310-cp310-win_amd64.whl (2.6 MB)
Using cached PyYAML-6.0.1-cp310-cp310-win_amd64.whl (145 kB)
Using cached tensorboard-2.15.1-py3-none-any.whl (5.5 MB)
Using cached torch-2.1.0-cp310-cp310-win_amd64.whl (192.3 MB)
Using cached absl_py-2.0.0-py3-none-any.whl (130 kB)
Using cached filelock-3.13.1-py3-none-any.whl (11 kB)
Using cached fsspec-2023.10.0-py3-none-any.whl (166 kB)
Using cached google_auth-2.23.4-py2.py3-none-any.whl (183 kB)
Using cached google_auth_oauthlib-1.1.0-py2.py3-none-any.whl (19 kB)
Using cached cloudpickle-3.0.0-py3-none-any.whl (20 kB)
Using cached Markdown-3.5.1-py3-none-any.whl (102 kB)
Using cached tensorboard_data_server-0.7.2-py3-none-any.whl (2.4 kB)
Using cached tqdm-4.66.1-py3-none-any.whl (78 kB)
Using cached typing_extensions-4.8.0-py3-none-any.whl (31 kB)
Using cached werkzeug-3.0.1-py3-none-any.whl (226 kB)
Using cached networkx-3.2.1-py3-none-any.whl (1.6 MB)
Using cached cachetools-5.3.2-py3-none-any.whl (9.3 kB)
Using cached MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl (17 kB)
Building wheels for collected packages: mlagents, numpy
  Building wheel for mlagents (setup.py) ... done
  Created wheel for mlagents: filename=mlagents-1.0.0-py3-none-any.whl size=171318 sha256=8cd896e844614084a99f660a72a708bfd9be60ead419ff8915ceccb4bdc2d15a
  Stored in directory: c:\users\TestUser\appdata\local\pip\cache\wheels\e7\6c\d6\fd5823c0d9af4009e7dbbe70b807444668674059594355c808
  Building wheel for numpy (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for numpy (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [287 lines of output]
      setup.py:63: RuntimeWarning: NumPy 1.21.2 may not yet support Python 3.10.
        warnings.warn(
      Running from numpy source directory.
      Processing numpy/random\_bounded_integers.pxd.in
      Processing numpy/random\bit_generator.pyx
      Processing numpy/random\mtrand.pyx
      Processing numpy/random\_bounded_integers.pyx.in
      Processing numpy/random\_common.pyx
      Processing numpy/random\_generator.pyx
      Processing numpy/random\_mt19937.pyx
      Processing numpy/random\_pcg64.pyx
      Processing numpy/random\_philox.pyx
      Processing numpy/random\_sfc64.pyx
      Cythonizing sources
      blas_opt_info:
      blas_mkl_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries mkl_rt not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      blis_info:
        libraries blis not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      openblas_info:
        libraries openblas not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
      get_default_fcompiler: matching types: '['gnu', 'intelv', 'absoft', 'compaqv', 'intelev', 'gnu95', 'g95', 'intelvem', 'intelem', 'flang']'
      customize GnuFCompiler
      Could not locate executable g77
      Could not locate executable f77
      customize IntelVisualFCompiler
      Could not locate executable ifort
      Could not locate executable ifl
      customize AbsoftFCompiler
      Could not locate executable f90
      customize CompaqVisualFCompiler
      Could not locate executable DF
      customize IntelItaniumVisualFCompiler
      Could not locate executable efl
      customize Gnu95FCompiler
      Could not locate executable gfortran
      Could not locate executable f95
      customize G95FCompiler
      Could not locate executable g95
      customize IntelEM64VisualFCompiler
      customize IntelEM64TFCompiler
      Could not locate executable efort
      Could not locate executable efc
      customize PGroupFlangCompiler
      Could not locate executable flang
      don't know how to compile Fortran code on platform 'nt'
        NOT AVAILABLE

      accelerate_info:
        NOT AVAILABLE

      atlas_3_10_blas_threads_info:
      Setting PTATLAS=ATLAS
        libraries tatlas not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      atlas_3_10_blas_info:
        libraries satlas not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      atlas_blas_threads_info:
      Setting PTATLAS=ATLAS
        libraries ptf77blas,ptcblas,atlas not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      atlas_blas_info:
        libraries f77blas,cblas,atlas not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\system_info.py:2026: UserWarning:
          Optimized (vendor) Blas libraries are not found.
          Falls back to netlib Blas library which has worse performance.
          A better performance should be easily gained by switching
          Blas library.
        if self._calc_info(blas):
      blas_info:
        libraries blas not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\system_info.py:2026: UserWarning:
          Blas (http://www.netlib.org/blas/) libraries not found.
          Directories to search for the libraries can be specified in the
          numpy/distutils/site.cfg file (section [blas]) or by setting
          the BLAS environment variable.
        if self._calc_info(blas):
      blas_src_info:
        NOT AVAILABLE

      C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\system_info.py:2026: UserWarning:
          Blas (http://www.netlib.org/blas/) sources not found.
          Directories to search for the sources can be specified in the
          numpy/distutils/site.cfg file (section [blas_src]) or by setting
          the BLAS_SRC environment variable.
        if self._calc_info(blas):
        NOT AVAILABLE

      non-existing path in 'numpy\\distutils': 'site.cfg'
      lapack_opt_info:
      lapack_mkl_info:
        libraries mkl_rt not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      openblas_lapack_info:
        libraries openblas not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      openblas_clapack_info:
        libraries openblas,lapack not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      flame_info:
        libraries flame not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      atlas_3_10_threads_info:
      Setting PTATLAS=ATLAS
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries tatlas,tatlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries lapack_atlas not found in C:\
        libraries tatlas,tatlas not found in C:\
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries tatlas,tatlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\Library\lib
        libraries tatlas,tatlas not found in C:\ProgramData\anaconda3\Library\lib
      <class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
        NOT AVAILABLE

      atlas_3_10_info:
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries satlas,satlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries lapack_atlas not found in C:\
        libraries satlas,satlas not found in C:\
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries satlas,satlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\Library\lib
        libraries satlas,satlas not found in C:\ProgramData\anaconda3\Library\lib
      <class 'numpy.distutils.system_info.atlas_3_10_info'>
        NOT AVAILABLE

      atlas_threads_info:
      Setting PTATLAS=ATLAS
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries ptf77blas,ptcblas,atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries lapack_atlas not found in C:\
        libraries ptf77blas,ptcblas,atlas not found in C:\
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries ptf77blas,ptcblas,atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\Library\lib
        libraries ptf77blas,ptcblas,atlas not found in C:\ProgramData\anaconda3\Library\lib
      <class 'numpy.distutils.system_info.atlas_threads_info'>
        NOT AVAILABLE

      atlas_info:
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries f77blas,cblas,atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib
        libraries lapack_atlas not found in C:\
        libraries f77blas,cblas,atlas not found in C:\
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries f77blas,cblas,atlas not found in C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\libs
        libraries lapack_atlas not found in C:\ProgramData\anaconda3\Library\lib
        libraries f77blas,cblas,atlas not found in C:\ProgramData\anaconda3\Library\lib
      <class 'numpy.distutils.system_info.atlas_info'>
        NOT AVAILABLE

      lapack_info:
        libraries lapack not found in ['C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\lib', 'C:\\', 'C:\\ProgramData\\anaconda3\\envs\\mlagents_python_3_10_12\\libs', 'C:\\ProgramData\\anaconda3\\Library\\lib']
        NOT AVAILABLE

      C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\system_info.py:1858: UserWarning:
          Lapack (http://www.netlib.org/lapack/) libraries not found.
          Directories to search for the libraries can be specified in the
          numpy/distutils/site.cfg file (section [lapack]) or by setting
          the LAPACK environment variable.
        return getattr(self, '_calc_info_{}'.format(name))()
      lapack_src_info:
        NOT AVAILABLE

      C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\system_info.py:1858: UserWarning:
          Lapack (http://www.netlib.org/lapack/) sources not found.
          Directories to search for the sources can be specified in the
          numpy/distutils/site.cfg file (section [lapack_src]) or by setting
          the LAPACK_SRC environment variable.
        return getattr(self, '_calc_info_{}'.format(name))()
        NOT AVAILABLE

      numpy_linalg_lapack_lite:
        FOUND:
          language = c
          define_macros = [('HAVE_BLAS_ILP64', None), ('BLAS_SYMBOL_SUFFIX', '64_')]

      Warning: attempted relative import with no known parent package
      C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\dist.py:275: UserWarning: Unknown distribution option: 'define_macros'
        warnings.warn(msg)
      running bdist_wheel
      running build
      running config_cc
      unifing config_cc, config, build_clib, build_ext, build commands --compiler options
      running config_fc
      unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options
      running build_src
      build_src
      building py_modules sources
      creating build
      creating build\src.win-amd64-3.10
      creating build\src.win-amd64-3.10\numpy
      creating build\src.win-amd64-3.10\numpy\distutils
      building library "npymath" sources
      Traceback (most recent call last):
        File "C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "C:\ProgramData\anaconda3\envs\mlagents_python_3_10_12\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 251, in build_wheel
          return _build_backend().build_wheel(wheel_directory, config_settings,
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\build_meta.py", line 211, in build_wheel
          return self._build_with_temp_dir(['bdist_wheel'], '.whl',
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\build_meta.py", line 197, in _build_with_temp_dir
          self.run_setup()
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\build_meta.py", line 248, in run_setup
          super(_BuildMetaLegacyBackend,
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\build_meta.py", line 142, in run_setup
          exec(compile(code, __file__, 'exec'), locals())
        File "setup.py", line 448, in <module>
          setup_package()
        File "setup.py", line 440, in setup_package
          setup(**metadata)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\core.py", line 169, in setup
          return old_setup(**new_attr)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\__init__.py", line 165, in setup
          return distutils.core.setup(**attrs)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\core.py", line 148, in setup
          dist.run_commands()
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 967, in run_commands
          self.run_command(cmd)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 986, in run_command
          cmd_obj.run()
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\wheel\bdist_wheel.py", line 299, in run
          self.run_command('build')
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\cmd.py", line 313, in run_command
          self.distribution.run_command(command)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 986, in run_command
          cmd_obj.run()
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\build.py", line 61, in run
          old_build.run(self)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\command\build.py", line 135, in run
          self.run_command(cmd_name)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\cmd.py", line 313, in run_command
          self.distribution.run_command(command)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 986, in run_command
          cmd_obj.run()
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\build_src.py", line 144, in run
          self.build_sources()
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\build_src.py", line 155, in build_sources
          self.build_library_sources(*libname_info)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\build_src.py", line 288, in build_library_sources
          sources = self.generate_sources(sources, (lib_name, build_info))
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\build_src.py", line 378, in generate_sources
          source = func(extension, build_dir)
        File "numpy\core\setup.py", line 661, in get_mathlib_info
          st = config_cmd.try_link('int main(void) { return 0;}')
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\command\config.py", line 243, in try_link
          self._link(body, headers, include_dirs,
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\config.py", line 163, in _link
          return self._wrap_method(old_config._link, lang,
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\config.py", line 98, in _wrap_method
          ret = mth(*((self,)+args))
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\command\config.py", line 137, in _link
          (src, obj) = self._compile(body, headers, include_dirs, lang)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\config.py", line 106, in _compile
          src, obj = self._wrap_method(old_config._compile, lang,
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\command\config.py", line 98, in _wrap_method
          ret = mth(*((self,)+args))
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\command\config.py", line 132, in _compile
          self.compiler.compile([src], include_dirs=include_dirs)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\_msvccompiler.py", line 401, in compile
          self.spawn(args)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-build-env-dpw6tl0y\overlay\Lib\site-packages\setuptools\_distutils\_msvccompiler.py", line 505, in spawn
          return super().spawn(cmd, env=env)
        File "C:\Users\TestUser\AppData\Local\Temp\pip-install-6f2rafsg\numpy_b49ee96b58fa487f998a6367e070897a\numpy\distutils\ccompiler.py", line 88, in <lambda>
          m = lambda self, *args, **kw: func(self, *args, **kw)
      TypeError: CCompiler_spawn() got an unexpected keyword argument 'env'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for numpy
Successfully built mlagents
Failed to build numpy
ERROR: Could not build wheels for numpy, which is required to install pyproject.toml-based projects
azazelcodes commented 10 months ago

Getting this as well, although I'm running:

python -m pip install ./ml-agents-envs
python -m pip install ./ml-agents

azazelcodes commented 10 months ago

Fixed it by downgrading to Python 3.9.13.
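
For example, with conda (the environment name is just an example):

conda create -n mlagents_py39 python=3.9.13
conda activate mlagents_py39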

ohernpaul commented 10 months ago

Fixed it by downgrading to Python 3.9.13.

Are you trying to install Release 21, i.e. ml-agents==1.0.0?

I tried on Python 3.9.13 and am still getting this error:

(mlagents_python_3_9_13) D:\MProjects\AIsland\ml-agents-master>pip install ./ml-agents
Processing d:\mprojects\aisland\ml-agents-master\ml-agents
  Preparing metadata (setup.py) ... done
Collecting grpcio<=1.48.2,>=1.11.0 (from mlagents==1.0.0)
  Downloading grpcio-1.48.2-cp39-cp39-win_amd64.whl (3.6 MB)
     ---------------------------------------- 3.6/3.6 MB 15.3 MB/s eta 0:00:00
Collecting h5py>=2.9.0 (from mlagents==1.0.0)
  Downloading h5py-3.10.0-cp39-cp39-win_amd64.whl.metadata (2.5 kB)
INFO: pip is looking at multiple versions of mlagents to determine which version is compatible with other requirements. This could take a while.
ERROR: Ignored the following versions that require a different python version: 0.10.0.dev0 Requires-Python >=3.5,<3.8; 0.6.0 Requires-Python >=3.5,<=3.7; 0.6.1 Requires-Python >=3.5,<=3.7; 0.6.2 Requires-Python >=3.5,<=3.7; 0.8.0 Requires-Python >=3.5,<3.8; 0.8.1 Requires-Python >=3.5,<3.8; 0.8.2 Requires-Python >=3.5,<3.8; 0.9.0 Requires-Python >=3.5,<3.8; 0.9.1 Requires-Python >=3.5,<3.8; 0.9.2 Requires-Python >=3.5,<3.8; 0.9.3 Requires-Python >=3.5,<3.8; 1.0.0 Requires-Python >=3.10.1,<=3.10.12
ERROR: Could not find a version that satisfies the requirement mlagents_envs==1.0.0 (from mlagents) (from versions: 0.10.0.dev1, 0.10.0, 0.10.1, 0.11.0.dev0, 0.11.0, 0.12.0, 0.12.1, 0.13.0, 0.13.1, 0.14.0, 0.14.1, 0.15.0, 0.15.1, 0.16.0, 0.16.1, 0.17.0, 0.18.0, 0.18.1, 0.19.0, 0.20.0, 0.21.0, 0.21.1, 0.22.0, 0.23.0, 0.24.0, 0.24.1, 0.25.0, 0.25.1, 0.26.0, 0.27.0, 0.28.0, 0.29.0, 0.30.0)
ERROR: No matching distribution found for mlagents_envs==1.0.0
1.0.0 Requires-Python >=3.10.1,<=3.10.12
azazelcodes commented 10 months ago

Oh, right: I cloned release_20 instead. As far as I know, that's how you install a numpy version that old. It's odd that Release 21 requires a different Python version.
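
For reference, that release can be checked out with the usual clone-by-branch pattern from the installation docs, e.g.:

git clone --branch release_20 https://github.com/Unity-Technologies/ml-agents.git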

wanghang2000221 commented 10 months ago

I'm facing this issue too. I tried Python 3.11, but no success yet.

wanghang2000221 commented 10 months ago

I installed numpy 1.26, but the ml-agents install pulls in numpy 1.21 again.

DDoubleMaster commented 10 months ago

I found a solution that works for me. It may not be perfect and there may be some inconsistencies, but it works for now (it may stop working in the future). I used:

  1. Copilot
  2. The installation documentation: https://github.com/Unity-Technologies/ml-agents/blob/develop/docs/Installation.md
  3. A video by a YouTuber: https://www.youtube.com/watch?v=RANRz9oyzko
  4. This thread.

(Read all the way to the end before trying it.)

I did all of this on Python 3.9.13, and the method works in both Anaconda and venv (the steps are about the same either way). After creating the environment you can upgrade pip; it currently works with or without the upgrade, but I always upgrade, so here is the command if you need it:

python -m pip install --upgrade pip

Now install PyTorch with this command (taken from the documentation):

pip3 install torch~=1.13.1 -f https://download.pytorch.org/whl/torch_stable.html

The usual pip install didn't work for me, so use this one. Next, install protobuf 3.20.3 (the version used in the video; I haven't tried others, but this one works):

pip install protobuf==3.20.3

Now install ML-Agents itself:

pip install mlagents

Yes, I installed mlagents from PyPI, because the version in the repository refuses to install on Python older than 3.10 and also fails to install on 3.10 itself. The PyTorch, protobuf, and ML-Agents installs can be run in any order and it still works, as long as you install all of them. In theory it should now work; you can check by running:

mlagents-learn -h

If you see the list of options, you have done everything correctly and it works. I tried several variants and install combinations before arriving at this one. I hope this helps someone.

Translated with www.DeepL.com/Translator (free version) 😀 (Sorry if you see any spelling or logical errors.)

ohernpaul commented 10 months ago

Just a heads-up for anyone still stuck on this: the numpy pin was incorrect and has been updated on the develop branch. I was told this by a Unity employee and confirmed it against the ml-agents commit history:

https://github.com/Unity-Technologies/ml-agents/commit/f3dc8f615044c9226c7e7ed308e0aadc1def3b4d
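
For anyone who wants the fix without patching files by hand, a rough sketch of installing from the develop branch (same local-install commands used elsewhere in this thread):

git clone --branch develop https://github.com/Unity-Technologies/ml-agents.git
cd ml-agents
python -m pip install ./ml-agents-envs
python -m pip install ./ml-agents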

wanghang2000221 commented 10 months ago

Thank you very much; I will try it soon.

onurkurum commented 10 months ago

Workaround: in path-to-ml-agents\ml-agents-envs\setup.py, change "numpy==1.21.2" to "numpy==1.23.3".

numpy 1.21.2 is not compatible with Python 3.10.12 (there is no cp310 win_amd64 wheel: https://pypi.org/project/numpy/1.21.2/#files).

numpy 1.23.3 works fine with Python 3.10.12.
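
After editing the pin, install the local packages so pip uses the edited copy rather than the published mlagents_envs wheel (same commands as earlier in this thread):

pip install ./ml-agents-envs
pip install ./ml-agents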

Note: with numpy 1.24 I got a float error, which traces back to this:

NumPy 1.20 deprecated numpy.float, numpy.int, and similar aliases (they started issuing a deprecation warning), and NumPy 1.24 removed the aliases altogether, so using them now raises an error.
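
The removal is easy to see from the command line with numpy 1.24 or newer installed:

python -c "import numpy as np; print(np.float)"

On 1.24+ this raises an AttributeError because the alias is gone; the replacements are the builtin float or np.float64.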

final pip list

Package                 Version
----------------------- ------------
absl-py                 2.0.0
attrs                   23.1.0
cachetools              5.3.2
cattrs                  1.5.0
certifi                 2023.7.22
charset-normalizer      3.3.2
cloudpickle             3.0.0
colorama                0.4.6
filelock                3.13.1
fsspec                  2023.10.0
google-auth             2.23.4
google-auth-oauthlib    1.1.0
grpcio                  1.48.2
gym                     0.26.2
gym-notices             0.0.8
h5py                    3.10.0
huggingface-hub         0.19.4
idna                    3.4
Markdown                3.5.1
MarkupSafe              2.1.3
mlagents                1.0.0
mlagents-envs           1.0.0
numpy                   1.23.3
oauthlib                3.2.2
onnx                    1.12.0
packaging               23.2
PettingZoo              1.15.0
Pillow                  10.1.0
pip                     23.3.1
protobuf                3.19.6
pyasn1                  0.5.0
pyasn1-modules          0.3.0
pypiwin32               223
pywin32                 306
PyYAML                  6.0.1
requests                2.31.0
requests-oauthlib       1.3.1
rsa                     4.9
setuptools              65.5.0
six                     1.16.0
tensorboard             2.15.1
tensorboard-data-server 0.7.2
torch                   1.13.1+cu117
tqdm                    4.66.1
typing_extensions       4.8.0
urllib3                 2.1.0
Werkzeug                3.0.1

confirmation:

>mlagents-learn --help
usage: mlagents-learn.exe [-h] [--env ENV_PATH] [--resume] [--deterministic] [--force] [--run-id RUN_ID] [--initialize-from RUN_ID] [--seed SEED] [--inference] [--base-port BASE_PORT] [--num-envs NUM_ENVS]
                          [--num-areas NUM_AREAS] [--debug] [--env-args ...] [--max-lifetime-restarts MAX_LIFETIME_RESTARTS] [--restarts-rate-limit-n RESTARTS_RATE_LIMIT_N]
                          [--restarts-rate-limit-period-s RESTARTS_RATE_LIMIT_PERIOD_S] [--torch] [--tensorflow] [--results-dir RESULTS_DIR] [--timeout-wait TIMEOUT_WAIT] [--width WIDTH] [--height HEIGHT]
                          [--quality-level QUALITY_LEVEL] [--time-scale TIME_SCALE] [--target-frame-rate TARGET_FRAME_RATE] [--capture-frame-rate CAPTURE_FRAME_RATE] [--no-graphics] [--torch-device DEVICE]
                          [trainer_config_path]

positional arguments:
  trainer_config_path

options:
  -h, --help            show this help message and exit
  --env ENV_PATH        Path to the Unity executable to train (default: None)
  --resume              Whether to resume training from a checkpoint. Specify a --run-id to use this option. If set, the training code loads an already trained model to initialize the neural network before
                        resuming training. This option is only valid when the models exist, and have the same behavior names as the current agents in your scene. (default: False)
  --deterministic       Whether to select actions deterministically in policy. `dist.mean` for continuous action space, and `dist.argmax` for deterministic action space (default: False)
  --force               Whether to force-overwrite this run-id's existing summary and model data. (Without this flag, attempting to train a model with a run-id that has been used before will throw an
                        error. (default: False)
  --run-id RUN_ID       The identifier for the training run. This identifier is used to name the subdirectories in which the trained model and summary statistics are saved as well as the saved model
                        itself. If you use TensorBoard to view the training statistics, always set a unique run-id for each training run. (The statistics for all runs with the same id are combined as if
                        they were produced by a the same session.) (default: ppo)
  --initialize-from RUN_ID
                        Specify a previously saved run ID from which to initialize the model from. This can be used, for instance, to fine-tune an existing model on a new environment. Note that the
                        previously saved models must have the same behavior parameters as your current environment. (default: None)
  --seed SEED           A number to use as a seed for the random number generator used by the training code (default: -1)
  --inference           Whether to run in Python inference mode (i.e. no training). Use with --resume to load a model trained with an existing run ID. (default: False)
  --base-port BASE_PORT
                        The starting port for environment communication. Each concurrent Unity environment instance will get assigned a port sequentially, starting from the base-port. Each instance will
                        use the port (base_port + worker_id), where the worker_id is sequential IDs given to each instance from 0 to (num_envs - 1). Note that when training using the Editor rather than an
                        executable, the base port will be ignored. (default: 5005)
  --num-envs NUM_ENVS   The number of concurrent Unity environment instances to collect experiences from when training (default: 1)
  --num-areas NUM_AREAS
                        The number of parallel training areas in each Unity environment instance. (default: 1)
  --debug               Whether to enable debug-level logging for some parts of the code (default: False)
  --env-args ...        Arguments passed to the Unity executable. Be aware that the standalone build will also process these as Unity Command Line Arguments. You should choose different argument names if
                        you want to create environment-specific arguments. All arguments after this flag will be passed to the executable. (default: None)
  --max-lifetime-restarts MAX_LIFETIME_RESTARTS
                        The max number of times a single Unity executable can crash over its lifetime before ml-agents exits. Can be set to -1 if no limit is desired. (default: 10)
  --restarts-rate-limit-n RESTARTS_RATE_LIMIT_N
                        The maximum number of times a single Unity executable can crash over a period of time (period set in restarts-rate-limit-period-s). Can be set to -1 to not use rate limiting with
                        restarts. (default: 1)
  --restarts-rate-limit-period-s RESTARTS_RATE_LIMIT_PERIOD_S
                        The period of time --restarts-rate-limit-n applies to. (default: 60)
  --torch               (Removed) Use the PyTorch framework. (default: False)
  --tensorflow          (Removed) Use the TensorFlow framework. (default: False)
  --results-dir RESULTS_DIR
                        Results base directory (default: results)
  --timeout-wait TIMEOUT_WAIT
                        The period of time to wait on a Unity environment to startup for training. (default: 60)

Engine Configuration:
  --width WIDTH         The width of the executable window of the environment(s) in pixels (ignored for editor training). (default: 84)
  --height HEIGHT       The height of the executable window of the environment(s) in pixels (ignored for editor training) (default: 84)
  --quality-level QUALITY_LEVEL
                        The quality level of the environment(s). Equivalent to calling QualitySettings.SetQualityLevel in Unity. (default: 5)
  --time-scale TIME_SCALE
                        The time scale of the Unity environment(s). Equivalent to setting Time.timeScale in Unity. (default: 20)
  --target-frame-rate TARGET_FRAME_RATE
                        The target frame rate of the Unity environment(s). Equivalent to setting Application.targetFrameRate in Unity. (default: -1)
  --capture-frame-rate CAPTURE_FRAME_RATE
                        The capture frame rate of the Unity environment(s). Equivalent to setting Time.captureFramerate in Unity. (default: 60)
  --no-graphics         Whether to run the Unity executable in no-graphics mode (i.e. without initializing the graphics driver. Use this only if your agents don't use visual observations. (default: False)

Torch Configuration:
  --torch-device DEVICE
                        Settings for the default torch.device used in training, for example, "cpu", "cuda", or "cuda:0" (default: None)
miguelalonsojr commented 10 months ago

This should be fixed on develop.

NicolasKuske commented 4 months ago

Hi, yes, it is fixed on develop; I just tried it. Please update ml-agents/docs/Installation.md accordingly.