-
Hi,
I tried out this project and it is one of the few that actually works off the shelf, thank you for your work.
Is there a way to enable self-play when training an agent? My use case is to use Dr…
-
Version 1.5.10: I can't use the self-created sounds since the last two updates; I can't even select them.
transmitter X20s
-
### Describe the feature
I'm using this snippet of code to play back a WAV file to the caller using pjsua2 `AudioMediaPlayer`.
How can I send audio chunk by chunk (audio frames) to the caller instead of pl…
-
### What happened + What you expected to happen
The example script **self_play_league_based_with_open_spiel.py** found [**here**](https://github.com/ray-project/ray/blob/master/rllib/examples/multi_a…
-
I tried to install version v1.20.0 with Docker following the instructions in the contrib folder, but after installation the player container failed to start. I got the following errors:
```
npm ER…
-
## How can Bevy's documentation be improved?
```rs
/// Returns true if the animation is currently playing or paused, or false
/// if the animation is stopped.
pub fn animation_is_playing(&self, …
-
I have the following issue with 2024.7.2. Everything was working with 2024.6.4, but now my DLNA server works while streaming from radio-browser no longer works, after having worked for quite some time. I e…
-
Hi, I just read and reproduced your code. It's very good and easy to understand.
But I have a question: shouldn't we use the best model as the self-play agent to generate memory?
If our trainin…
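One common way to frame the question above is the AlphaZero-style evaluation gate: self-play memory is always generated by the current best network, and a newly trained candidate only replaces it after winning a sufficient fraction of head-to-head evaluation games. Below is a minimal, hypothetical sketch of that gating logic; `play_game` is a toy stand-in for a real game rollout, and the names and thresholds are illustrative assumptions, not the author's code.

```python
def play_game(agent_a, agent_b):
    # Toy stand-in for a real self-play rollout: the agent with the
    # higher "skill" wins. Replace with an actual game between networks.
    return 1 if agent_a["skill"] >= agent_b["skill"] else -1

def evaluation_gate(best, candidate, n_games=20, win_threshold=0.55):
    # AlphaZero-style gate: the candidate replaces the best model only
    # if it wins at least win_threshold of the evaluation games.
    wins = sum(1 for _ in range(n_games)
               if play_game(candidate, best) == 1)
    return candidate if wins / n_games >= win_threshold else best

# Usage: self-play memory would always be generated by `best`,
# while training produces `candidate` networks to challenge it.
best = {"skill": 1.0}
candidate = {"skill": 2.0}   # newly trained network (toy example)
best = evaluation_gate(best, candidate)
```

The design choice here is that the memory-generating agent changes only at gate time, which keeps the replay buffer's data distribution tied to the strongest known policy rather than to a possibly regressed latest checkpoint.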
-
Self-play, and multi-LM-agent settings in general, are something we are very interested in exploring. What would it take to support this? Does it already work without big overheads?
-
### What happened + What you expected to happen
Hi, I am using a self-play scheme on simple_tag_v2 from PettingZoo that worked on a previous installation of ray_300_dev0 and an old Ray 1.2.0 (with modi…