Closed: scottastone closed this issue 2 years ago.
Hi, this is my environment:
I ended up using gcc/9.1.0; you need gcc > 8.
module load gcc/9.1.0 binutils/2.28 cmake/3.21.1 git python/3.7.7
Something makes part of CMake fall back to the default /usr/bin/gcc, which is 4.4, so add this:
export CC=gcc
export CXX=g++
(I haven't figured out why; it seems common on this system. Setting CC and CXX as above makes the build use the compiler on the PATH.)
The following is just a convenience in our environment:
export LDFLAGS="$LDFLAGS -Wl,-rpath,/usr/local/gcc/9.1.0/lib64"
Also, I git cloned the repo (rather than downloading v2.2.0.tar.gz):
git clone https://github.com/deepmind/mujoco.git
In reply to: [deepmind/mujoco] Python bindings won't compile - complaints of #pragma (Issue #379)
Hi there
I'm trying to use MuJoCo on a cluster (Compute Canada, if it makes any difference). I am trying to compile the Python bindings for MuJoCo 2.2.0. Amazingly, they don't have the precompiled bindings on the server, but they do have the MuJoCo binaries.
gcc is 11.3.0
My workflow is as follows:
$ module load StdEnv/2020 # note this just loads the compiler, etc
$ module load mujoco/2.2.0
$ export MUJOCO_PATH=/cvmfs/soft.computecanada.ca/easybuild/software/2020/avx2/Core/mujoco/2.2.0
$ wget https://github.com/deepmind/mujoco/archive/refs/tags/2.2.0.zip
$ unzip 2.2.0.zip
$ cd mujoco-2.2.0/python
$ virtualenv tmp/env
$ source tmp/env/bin/activate
$ ./make_sdist.sh
$ pip wheel mujoco-2.2.0.tar.gz
I came across issue https://github.com/deepmind/mujoco/issues/354, but it wasn't obvious how @berniekirby solved it. I've tried removing everything, etc. I'm just trying to get this working with dm_control. Any help would be appreciated.
I end up getting a bunch of output like below:
….. lots deleted.
Thanks for the reply and suggestions @berniekirby. I tried adding export CC=gcc and export CXX=g++ (as on this cluster, the default /usr/bin/gcc is 4.8.5), to no avail; it just produces the same error.
I also have a local machine that I've tried this build process on (running Ubuntu 22.04 and gcc 11.2.0), and compiling there produces the same error, so I'm scratching my head even more.
I'll keep plugging away and see if something works. If someone from the MuJoCo team sees an obvious flaw, I'd super appreciate it!
@scottastone Are you not able to install via pip?
@saran-t Even if I try to install via pip, I still get the build error. Worse, when I deploy the job, the nodes have no internet access. My plan was that if I could at least build the wheel, I could ship it with the job and install the local copy. I should also mention that the cluster doesn't seem to support manylinux2014 wheels.
Just another update: I figured out the OS is CentOS 7, which should support manylinux2014 but doesn't seem to.
$ cat /etc/os-release
NAME="CentOS Linux"
VERSION="7 (Core)"
ID="centos"
ID_LIKE="rhel fedora"
VERSION_ID="7"
PRETTY_NAME="CentOS Linux 7 (Core)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:centos:centos:7"
HOME_URL="https://www.centos.org/"
BUG_REPORT_URL="https://bugs.centos.org/"
CENTOS_MANTISBT_PROJECT="CentOS-7"
CENTOS_MANTISBT_PROJECT_VERSION="7"
REDHAT_SUPPORT_PRODUCT="centos"
REDHAT_SUPPORT_PRODUCT_VERSION="7"
Thought I'd try building on a different machine and deploying the wheel, but no success yet:
$ pip install mujoco-2.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Ignoring pip: markers 'python_version < "3"' don't match your environment
ERROR: mujoco-2.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl is not a supported wheel on this platform.
Could you please paste the full output of pip debug --verbose 2> /dev/null | grep cp3?
$ pip debug --verbose 2> /dev/null | grep cp3
cp39-cp39-linux_x86_64
cp39-abi3-linux_x86_64
cp39-none-linux_x86_64
cp38-abi3-linux_x86_64
cp37-abi3-linux_x86_64
cp36-abi3-linux_x86_64
cp35-abi3-linux_x86_64
cp34-abi3-linux_x86_64
cp33-abi3-linux_x86_64
cp32-abi3-linux_x86_64
cp39-none-any
This is very strange... somehow pip decides that your system doesn't conform to any of the manylinux requirements, not even the oldest (manylinux1), which means that you won't be able to install anything via prebuilt wheels.
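One quick check (just a diagnostic sketch; it assumes a glibc-based Linux and that python3 is the interpreter pip is running under): manylinux2014 wheels require glibc >= 2.17, and asking Python what C library it was linked against often explains why pip rejects every prebuilt wheel.

```shell
# manylinux2014 wheels require glibc >= 2.17. Ask the Python
# interpreter which C library it sees; if this prints something
# other than glibc >= 2.17, that explains pip's rejection.
python3 -c "import platform; print(platform.libc_ver())"
```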
Is there some way that I can make a VM with the same environment as what you're using?
To try and answer your original question: the pragma in question most definitely comes from https://github.com/deepmind/mujoco/blob/main/python/mujoco/util/crossplatform.h#L54. As you can see, that pragma is only supposed to be there when the __clang__ macro is defined, which implies that for some reason your GCC is pretending to be Clang.
Very interesting - I can try to get in contact with the cluster engineers to get a better understanding of what is going on, but they are quite slow to help. If you think it would be worthwhile, I'm happy to grant you access to the cluster as I'm not sure of the best way to create an effective VM clone.
Also: I wanted to note that I don't think the error of gcc being confused for clang is unique to this machine. I have a local server running Ubuntu 22.04, and it makes the same mistake.
Update:
I was able to successfully build MuJoCo using clang 13.0.1. However, when I go to import mujoco in Python, I get the following error from glfw:
(env) [stone@cedar1 dist]$ python
Python 3.9.6 (default, Jul 12 2021, 18:23:59)
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import mujoco
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/project/6029978/stone/work/mujoco-2.2.0/python/tmp/env/lib/python3.9/site-packages/mujoco/__init__.py", line 54, in <module>
from mujoco.glfw import GLContext
File "/project/6029978/stone/work/mujoco-2.2.0/python/tmp/env/lib/python3.9/site-packages/mujoco/glfw/__init__.py", line 17, in <module>
import glfw
File "/project/6029978/stone/work/mujoco-2.2.0/python/tmp/env/lib/python3.9/site-packages/glfw/__init__.py", line 43, in <module>
raise ImportError("Failed to load GLFW3 shared library.")
ImportError: Failed to load GLFW3 shared library.
I have tried module load glfw/3.3.2 and it didn't seem to make a difference. I also tried building GLFW from source, but I haven't been successful yet; maybe this isn't the rabbit hole to go down.
What's the GPU installed on your cluster? If it's Nvidia, then you're better off with EGL: set the environment variable MUJOCO_GL=egl and see if that works. Alternatively, if you don't need rendering (or just want to test that your base installation works), set MUJOCO_GL=off.
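For what it's worth, the backend is chosen when mujoco is first imported, so MUJOCO_GL has to be in the environment before Python starts (or be set before the first import). A minimal headless sanity check might look like this (a sketch, not an official test procedure):

```shell
# MUJOCO_GL is read at import time, so export it before launching
# Python. "egl" suits Nvidia GPUs; "off" disables rendering
# entirely for CPU-only runs.
export MUJOCO_GL=off
python3 -c 'import os; print("backend:", os.environ["MUJOCO_GL"])'
```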
Did you ever figure out what happened with GCC pretending to be Clang? I'm curious to find out the root cause there.
No idea about the gcc-thinking-it's-clang issue; I just know it persists on my local Ubuntu installation as well.
There is no GPU (as far as I know); the model we're running is A3C, so we're using CPU-only nodes, and we're fine going renderless.
Just to throw in another wrinkle: the entire reason I'm trying to do this is that we're actually using dm_control for our environment, so at the top of our script we have from dm_control import mujoco. When I run that, it gives me this fun error, suggesting that I cannot set MUJOCO_GL to off:
>>> from dm_control import mujoco
Traceback (most recent call last):
File "/project/6029978/stone/work/env/lib/python3.9/site-packages/dm_control/_render/__init__.py", line 60, in <module>
import_func = _ALL_RENDERERS[BACKEND]
KeyError: 'off'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/project/6029978/stone/work/env/lib/python3.9/site-packages/dm_control/mujoco/__init__.py", line 18, in <module>
from dm_control.mujoco.engine import action_spec
File "/project/6029978/stone/work/env/lib/python3.9/site-packages/dm_control/mujoco/engine.py", line 41, in <module>
from dm_control import _render
File "/project/6029978/stone/work/env/lib/python3.9/site-packages/dm_control/_render/__init__.py", line 62, in <module>
raise RuntimeError(
RuntimeError: Environment variable MUJOCO_GL must be one of odict_keys(['glfw', 'egl', 'osmesa']): got 'off'.
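The traceback shows dm_control looking MUJOCO_GL up in its renderer table, which at this version only has the keys glfw, egl, and osmesa; "off" simply isn't a key, hence the KeyError. A sketch of the workaround, assuming one of the accepted backends is usable on the machine:

```shell
# dm_control only accepts glfw, egl, or osmesa here, so pick one
# of those instead of "off".
export MUJOCO_GL=osmesa
python3 -c 'import os; assert os.environ["MUJOCO_GL"] in ("glfw", "egl", "osmesa"); print("ok")'
```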
Also, there seems to be a difference between mujoco and the version wrapped by dm_control: there is no class called mujoco.Physics in plain mujoco, but there is in the dm_control version. However, if I just do a simple import mujoco, it no longer complains. Sigh.
This is a dm_control bug; I'll fix it for our next release. Sorry about that. For the time being, if you're able to install OSMesa, the easiest way to proceed is to install it and set MUJOCO_GL=osmesa.
Awesome, that seems to have fixed it. I am now able to load and run our code no problems (sans rendering, but that's ok).
I'm wondering if I should leave this issue open, though, because something seems to trick gcc into thinking it's clang. A decent workaround is to just use clang, but that might not work for everyone. I can do a deeper dive and see if I can figure out what's going on.
With OSMesa you'd actually be able to render correctly as well, just very slowly (since everything is done on the CPU).
I've never actually seen GCC pretending to be Clang before, so it might be something specific to your cluster. Perhaps it's worth asking your sysadmin?
I will ask the sysadmin about it, but I'm also getting it on a local Ubuntu machine when I try to compile the bindings, so there may be an issue in the CMake files.
You can see our GCC 11 build logs on GH Actions here for example.
If you're interested, you can fork this repo, and edit build.yml to use the same GCC and Ubuntu versions as (or as close as possible to) your machine to see if you can repro the issue that way. If you can, that'll give us a full log to look at.
@scottastone I added more GitHub Actions workflows yesterday, so we're now also building on Ubuntu 20 with GCC 9 and 10. I can't reproduce your original problem on any of these platforms, so I'm now 99% sure it's something on your end. Could you please make a new, clean build directory, run cmake, and post the full configuration output?
Sure. Do you want the cmake output for the MuJoCo build or just the bindings? I'm not sure how to run cmake on just the bindings.
Actually, the easiest thing would be to add -v to your pip wheel call. I really just want to see the first couple of lines from CMake (the compiler identification lines).
I assume this is what you're looking for:
I'm running this on my local machine (Ubuntu 22.04) using Python 3.10.4. gcc --version output:
gcc (Ubuntu 11.2.0-19ubuntu1) 11.2.0
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
(env) scott@frankenstein:~/code/sources/tmp/mujoco/python/dist$ pip wheel mujoco-2.2.0.tar.gz -v
Processing ./mujoco-2.2.0.tar.gz
File was already downloaded /home/scott/code/sources/tmp/mujoco/python/dist/mujoco-2.2.0.tar.gz
Running command python setup.py egg_info
running egg_info
creating /tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info
writing /tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info/PKG-INFO
writing dependency_links to /tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info/dependency_links.txt
writing requirements to /tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info/requires.txt
writing top-level names to /tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info/top_level.txt
writing manifest file '/tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info/SOURCES.txt'
reading manifest file '/tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
adding license file 'LICENSE'
writing manifest file '/tmp/pip-pip-egg-info-22qpycqy/mujoco.egg-info/SOURCES.txt'
Preparing metadata (setup.py) ... done
Collecting absl-py
Using cached absl_py-1.1.0-py3-none-any.whl (123 kB)
Collecting glfw
Using cached glfw-2.5.3-py2.py27.py3.py30.py31.py32.py33.py34.py35.py36.py37.py38-none-manylinux2014_x86_64.whl (206 kB)
Collecting numpy
Using cached numpy-1.23.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.0 MB)
Collecting pyopengl
Using cached PyOpenGL-3.1.6-py3-none-any.whl (2.4 MB)
Saved ./absl_py-1.1.0-py3-none-any.whl
Saved ./glfw-2.5.3-py2.py27.py3.py30.py31.py32.py33.py34.py35.py36.py37.py38-none-manylinux2014_x86_64.whl
Saved ./numpy-1.23.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Saved ./PyOpenGL-3.1.6-py3-none-any.whl
Building wheels for collected packages: mujoco
Running command python setup.py bdist_wheel
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-cpython-310
creating build/lib.linux-x86_64-cpython-310/mujoco
copying mujoco/bindings_test.py -> build/lib.linux-x86_64-cpython-310/mujoco
copying mujoco/rollout_test.py -> build/lib.linux-x86_64-cpython-310/mujoco
copying mujoco/__init__.py -> build/lib.linux-x86_64-cpython-310/mujoco
copying mujoco/rollout.py -> build/lib.linux-x86_64-cpython-310/mujoco
copying mujoco/render_test.py -> build/lib.linux-x86_64-cpython-310/mujoco
creating build/lib.linux-x86_64-cpython-310/mujoco/glfw
copying mujoco/glfw/__init__.py -> build/lib.linux-x86_64-cpython-310/mujoco/glfw
creating build/lib.linux-x86_64-cpython-310/mujoco/osmesa
copying mujoco/osmesa/__init__.py -> build/lib.linux-x86_64-cpython-310/mujoco/osmesa
creating build/lib.linux-x86_64-cpython-310/mujoco/egl
copying mujoco/egl/__init__.py -> build/lib.linux-x86_64-cpython-310/mujoco/egl
copying mujoco/egl/egl_ext.py -> build/lib.linux-x86_64-cpython-310/mujoco/egl
running build_ext
Configuring CMake with the following arguments:
-DPython3_ROOT_DIR:PATH=/home/scott/code/sources/tmp/mujoco/python/tmp/env
-DPython3_EXECUTABLE:STRING=/home/scott/code/sources/tmp/mujoco/python/tmp/env/bin/python
-DCMAKE_MODULE_PATH:PATH=/tmp/pip-req-build-79j72h9v/cmake
-DCMAKE_BUILD_TYPE:STRING=Release
-DCMAKE_LIBRARY_OUTPUT_DIRECTORY:PATH=build/temp.linux-x86_64-cpython-310
-DCMAKE_INTERPROCEDURAL_OPTIMIZATION:BOOL=ON
-DCMAKE_Fortran_COMPILER:STRING=
-DCMAKE_VERBOSE_MAKEFILE:BOOL=ON
-DBUILD_TESTING:BOOL=OFF
-DMUJOCO_LIBRARY_DIR:PATH=/opt/mujoco-2.2.0/lib
-DMUJOCO_INCLUDE_DIR:PATH=/opt/mujoco-2.2.0/include
-DPython3_LIBRARY=/usr/lib/python3.10
-DPython3_INCLUDE_DIR=/usr/include/python3.10
-- The C compiler identification is GNU 11.2.0
-- The CXX compiler identification is GNU 11.2.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test SUPPORTS_LLD
-- Performing Test SUPPORTS_LLD - Success
-- Performing Test SUPPORTS_GC_SECTIONS
-- Performing Test SUPPORTS_GC_SECTIONS - Success
-- Found Python3: /home/scott/code/sources/tmp/mujoco/python/tmp/env/bin/python (found version "3.10.4") found components: Interpreter Development Development.Module Development.Embed
MuJoCo is at /opt/mujoco-2.2.0/lib/libmujoco.so
MuJoCo headers are at /opt/mujoco-2.2.0/include
-- mujoco::FindOrFetch: checking for targets in package `absl`
-- mujoco::FindOrFetch: checking for targets in package `absl` - target `absl::core_headers` not defined.
-- mujoco::FindOrFetch: Using FetchContent to retrieve `abseil-cpp`
CMake Warning at /tmp/pip-req-build-79j72h9v/build/temp.linux-x86_64-cpython-310/_deps/abseil-cpp-src/CMakeLists.txt:70 (message):
A future Abseil release will default ABSL_PROPAGATE_CXX_STD to ON for CMake
3.8 and up. We recommend enabling this option to ensure your project still
builds correctly.
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- mujoco::FindOrFetch: Using FetchContent to retrieve `abseil-cpp` - Done
-- mujoco::FindOrFetch: checking for targets in package `Eigen3`
-- mujoco::FindOrFetch: checking for targets in package `Eigen3` - target `Eigen3::Eigen` not defined.
-- mujoco::FindOrFetch: Using FetchContent to retrieve `eigen`
-- Performing Test EIGEN_COMPILER_SUPPORT_CPP11
-- Performing Test EIGEN_COMPILER_SUPPORT_CPP11 - Success
-- Performing Test COMPILER_SUPPORT_std=cpp03
-- Performing Test COMPILER_SUPPORT_std=cpp03 - Success
-- Performing Test standard_math_library_linked_to_automatically
-- Performing Test standard_math_library_linked_to_automatically - Success
-- Standard libraries to link to explicitly: none
-- Performing Test COMPILER_SUPPORT_WERROR
-- Performing Test COMPILER_SUPPORT_WERROR - Success
-- Performing Test COMPILER_SUPPORT_pedantic
-- Performing Test COMPILER_SUPPORT_pedantic - Success
-- Performing Test COMPILER_SUPPORT_Wall
-- Performing Test COMPILER_SUPPORT_Wall - Success
-- Performing Test COMPILER_SUPPORT_Wextra
-- Performing Test COMPILER_SUPPORT_Wextra - Success
-- Performing Test COMPILER_SUPPORT_Wundef
-- Performing Test COMPILER_SUPPORT_Wundef - Success
-- Performing Test COMPILER_SUPPORT_Wcastalign
-- Performing Test COMPILER_SUPPORT_Wcastalign - Success
-- Performing Test COMPILER_SUPPORT_Wcharsubscripts
-- Performing Test COMPILER_SUPPORT_Wcharsubscripts - Success
-- Performing Test COMPILER_SUPPORT_Wnonvirtualdtor
-- Performing Test COMPILER_SUPPORT_Wnonvirtualdtor - Success
-- Performing Test COMPILER_SUPPORT_Wunusedlocaltypedefs
-- Performing Test COMPILER_SUPPORT_Wunusedlocaltypedefs - Success
-- Performing Test COMPILER_SUPPORT_Wpointerarith
-- Performing Test COMPILER_SUPPORT_Wpointerarith - Success
-- Performing Test COMPILER_SUPPORT_Wwritestrings
-- Performing Test COMPILER_SUPPORT_Wwritestrings - Success
-- Performing Test COMPILER_SUPPORT_Wformatsecurity
-- Performing Test COMPILER_SUPPORT_Wformatsecurity - Success
-- Performing Test COMPILER_SUPPORT_Wshorten64to32
-- Performing Test COMPILER_SUPPORT_Wshorten64to32 - Failed
-- Performing Test COMPILER_SUPPORT_Wlogicalop
-- Performing Test COMPILER_SUPPORT_Wlogicalop - Success
-- Performing Test COMPILER_SUPPORT_Wenumconversion
-- Performing Test COMPILER_SUPPORT_Wenumconversion - Success
-- Performing Test COMPILER_SUPPORT_Wcpp11extensions
-- Performing Test COMPILER_SUPPORT_Wcpp11extensions - Failed
-- Performing Test COMPILER_SUPPORT_Wdoublepromotion
-- Performing Test COMPILER_SUPPORT_Wdoublepromotion - Success
-- Performing Test COMPILER_SUPPORT_Wshadow
-- Performing Test COMPILER_SUPPORT_Wshadow - Success
-- Performing Test COMPILER_SUPPORT_Wnopsabi
-- Performing Test COMPILER_SUPPORT_Wnopsabi - Success
-- Performing Test COMPILER_SUPPORT_Wnovariadicmacros
-- Performing Test COMPILER_SUPPORT_Wnovariadicmacros - Success
-- Performing Test COMPILER_SUPPORT_Wnolonglong
-- Performing Test COMPILER_SUPPORT_Wnolonglong - Success
-- Performing Test COMPILER_SUPPORT_fnochecknew
-- Performing Test COMPILER_SUPPORT_fnochecknew - Success
-- Performing Test COMPILER_SUPPORT_fnocommon
-- Performing Test COMPILER_SUPPORT_fnocommon - Success
-- Performing Test COMPILER_SUPPORT_fstrictaliasing
-- Performing Test COMPILER_SUPPORT_fstrictaliasing - Success
-- Performing Test COMPILER_SUPPORT_wd981
-- Performing Test COMPILER_SUPPORT_wd981 - Failed
-- Performing Test COMPILER_SUPPORT_wd2304
-- Performing Test COMPILER_SUPPORT_wd2304 - Failed
-- Performing Test COMPILER_SUPPORT_STRICTANSI
-- Performing Test COMPILER_SUPPORT_STRICTANSI - Failed
-- Performing Test COMPILER_SUPPORT_Qunusedarguments
-- Performing Test COMPILER_SUPPORT_Qunusedarguments - Failed
-- Performing Test COMPILER_SUPPORT_ansi
-- Performing Test COMPILER_SUPPORT_ansi - Success
-- Performing Test COMPILER_SUPPORT_OPENMP
-- Performing Test COMPILER_SUPPORT_OPENMP - Success
qmake: could not exec '/usr/lib/qt5/bin/qmake': No such file or directory
-- Found unsuitable Qt version "" from NOTFOUND
qmake: could not exec '/usr/lib/qt5/bin/qmake': No such file or directory
-- Found unsuitable Qt version "" from NOTFOUND
-- Qt4 not found, so disabling the mandelbrot and opengl demos
-- Could NOT find CHOLMOD (missing: CHOLMOD_INCLUDES CHOLMOD_LIBRARIES)
-- Could NOT find UMFPACK (missing: UMFPACK_INCLUDES UMFPACK_LIBRARIES)
-- Could NOT find KLU (missing: KLU_INCLUDES KLU_LIBRARIES)
-- Could NOT find SuperLU (missing: SUPERLU_INCLUDES SUPERLU_LIBRARIES SUPERLU_VERSION_OK) (Required is at least version "4.0")
-- Checking for one of the modules 'hwloc'
-- Performing Test HAVE_HWLOC_PARENT_MEMBER
-- Performing Test HAVE_HWLOC_PARENT_MEMBER - Success
-- Performing Test HAVE_HWLOC_CACHE_ATTR
-- Performing Test HAVE_HWLOC_CACHE_ATTR - Success
-- Performing Test HAVE_HWLOC_OBJ_PU
-- Performing Test HAVE_HWLOC_OBJ_PU - Success
-- Looking for hwloc_bitmap_free in hwloc
-- Looking for hwloc_bitmap_free in hwloc - found
-- A version of Pastix has been found but pastix_nompi.h does not exist in the include directory. Because Eigen tests require a version without MPI, we disable the Pastix backend.
--
-- Configured Eigen 3.4.0
--
-- Available targets (use: make TARGET):
-- ---------+--------------------------------------------------------------
-- Target | Description
-- ---------+--------------------------------------------------------------
-- install | Install Eigen. Headers will be installed to:
-- | <CMAKE_INSTALL_PREFIX>/<INCLUDE_INSTALL_DIR>
-- | Using the following values:
-- | CMAKE_INSTALL_PREFIX: /usr/local
-- | INCLUDE_INSTALL_DIR: include/eigen3
-- | Change the install location of Eigen headers using:
-- | cmake . -DCMAKE_INSTALL_PREFIX=yourprefix
-- | Or:
-- | cmake . -DINCLUDE_INSTALL_DIR=yourdir
-- doc | Generate the API documentation, requires Doxygen & LaTeX
-- blas | Build BLAS library (not the same thing as Eigen)
-- uninstall| Remove files installed by the install target
-- ---------+--------------------------------------------------------------
--
-- mujoco::FindOrFetch: Using FetchContent to retrieve `eigen` - Done
-- mujoco::FindOrFetch: checking for targets in package `pybind11`
-- mujoco::FindOrFetch: checking for targets in package `pybind11` - target `pybind11::pybind11_headers` not defined.
-- mujoco::FindOrFetch: Using FetchContent to retrieve `pybind11`
-- pybind11 v2.10.0 dev1
-- Performing Test HAS_FLTO
-- Performing Test HAS_FLTO - Success
-- mujoco::FindOrFetch: Using FetchContent to retrieve `pybind11` - Done
-- Performing Test CAN_BUILD_AVX
-- Performing Test CAN_BUILD_AVX - Success
-- Configuring done
-- Generating done
-- Build files have been written to: /tmp/pip-req-build-79j72h9v/build/temp.linux-x86_64-cpython-310
Building all extensions with CMake
/usr/bin/cmake -S/tmp/pip-req-build-79j72h9v/mujoco -B/tmp/pip-req-build-79j72h9v/build/temp.linux-x86_64-cpython-310 --check-build-system CMakeFiles/Makefile.cmake 0
/usr/bin/cmake -E cmake_progress_start /tmp/pip-req-build-79j72h9v/build/temp.linux-x86_64-cpython-310/CMakeFiles /tmp/pip-req-build-79j72h9v/build/temp.linux-x86_64-cpython-310//CMakeFiles/progress.marks
/usr/bin/gmake -f CMakeFiles/Makefile2 all
gmake[1]: Entering directory '/tmp/pip-req-build-79j72h9v/build/temp.linux-x86_64-cpython-310'
/usr/bin/gmake -f _deps/abseil-cpp-build/absl/base/CMakeFiles/absl_log_severity.dir/build.make _deps/abseil-cpp-build/absl/base/CMakeFiles/absl_log_severity.dir/depend
/usr/bin/gmake -f _deps/abseil-cpp-build/absl/base/CMakeFiles/absl_spinlock_wait.dir/build.make _deps/abseil-cpp-build/absl/base/CMakeFiles/absl_spinlock_wait.dir/depend
/usr/bin/gmake -f _deps/abseil-cpp-build/absl/time/CMakeFiles/absl_time_zone.dir/build.make _deps/abseil-cpp-build/absl/time/CMakeFiles/absl_time_zone.dir/depend
/usr/bin/gmake -f _deps/abseil-cpp-build/absl/numeric/CMakeFiles/absl_int128.dir/build.make _deps/abseil-cpp-build/absl/numeric/CMakeFiles/absl_int128.dir/depend
/usr/bin/gmake -f _deps/abseil-cpp-build/absl/profiling/CMakeFiles/absl_exponential_biased.dir/build.make _deps/abseil-cpp-build/absl/profiling/CMakeFiles/absl_exponential_biased.dir/depend
I'm going to close this issue since you've found a working solution, and we've since added multiple GCC builds to our CI setup, all of which seem fine.