Closed — LYK-love closed this issue 7 months ago
Hi @LYK-love, this is intended right now, but I agree that the info is missing from the README. Can you try to install the env like this:
```shell
conda create -n sheeprl python=3.9
conda activate sheeprl
git clone git@github.com:Eclectic-Sheep/sheeprl.git
cd sheeprl
pip install .
pip install swig
pip install .\[atari,mujoco,dev,test,box2d\]
```
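After those steps, a quick check like the one below (my own sketch, not part of the repo) can confirm that the `Box2D` backend required by gymnasium's Box2D envs is importable; the package name and the suggested fix in the message are taken from the install commands above.

```python
# Hypothetical sanity check: confirm the Box2D backend required by
# gymnasium's box2d envs (e.g. CarRacing) is importable after install.
import importlib.util


def box2d_available() -> bool:
    # find_spec returns None when the package cannot be imported.
    return importlib.util.find_spec("Box2D") is not None


if __name__ == "__main__":
    if box2d_available():
        print("Box2D backend found")
    else:
        print("Box2D missing: run `pip install swig` then `pip install .[box2d]`")
```

If this prints the "missing" message, the extras were likely installed without `swig` being present first, which makes the `box2d-py` build fail silently in some setups.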
Hi @LYK-love, we modified the default packages of SheepRL because of some problems in the execution of the GitHub actions, and because we wanted a cleaner environment when we integrated SheepRL into DIAMBRA.
Now, to use Box2D environments, you have to install them manually, as @belerico said. We will update the table of supported environments in the README as soon as possible by adding the following two instructions for the Box2D installation:
```shell
pip install swig
pip install .[box2d]
```
Hello, when I try to install sheeprl from scratch, I find that even after following your instructions to install the dependencies, I still have to install `gymnasium[box2d]` to run Box2D envs. Here's my workflow: First,
Next, I try to train DreamerV3 on the Box2D CarRacing env:
I got this error:
So I have to install `gymnasium[box2d]` manually.