-
I don't exactly know how to reproduce this... but here's what it does
![2018-11-16 15 23 14](https://user-images.githubusercontent.com/15675877/48647896-65bf6b80-e9bb-11e8-815f-31d6308987e9.gif)
I…
-
Per Silke:
> XPP:SB3 sort of exists, but we typically run this camera in the DAQ. I believe we will be upgrading it to one of the new USB3 Allied cameras in the future, for which no IOC exists now. …
-
This is a requirement for SB3 parity.
-
**Describe the bug**
Using:
```python
df = YahooDownloader(start_date=TRAIN_START_DATE,
                     end_date=TRADE_END_DATE,
                     ticker_list=config_tickers.DOW_30_TICKER).fe…
```
-
**TLDR**: [Petting Zoo](https://www.pettingzoo.ml/) has become the standard library for multi-agent environments, and we want to support Petting Zoo's bindings in gym-microrts.
This project ht…
-
Our test~~s~~ just [caught a bug](https://github.com/PullJosh/sb-edit/pull/41#pullrequestreview-369207022), which is pretty neat! I'd like to reach a point where [automated tests](https://github.com/Pul…
-
### ❓ Question
Thank you very much for creating such an excellent tool. I am currently using the PPO algorithm in Stable-Baselines3 (SB3) for training in a custom environment. During this process, I …
-
![image](https://github.com/TurboWarp/desktop/assets/93743322/c474d44a-7198-412b-9bde-5ebf9ce924ec)
I got this. Do you want the .sb3 program?
-
**Motivation**
Stable-baselines3 (SB3) has introduced support for action masking (see [here](https://sb3-contrib.readthedocs.io/en/master/modules/ppo_mask.html)), which is a great feature. However, t…
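The core idea behind action masking (as in sb3-contrib's `MaskablePPO`) is to force invalid actions to zero probability before sampling. A minimal numpy sketch of that mechanism, independent of SB3 itself (the function name `masked_softmax` is illustrative, not part of any library API):

```python
import numpy as np

def masked_softmax(logits, mask):
    """Illustrative sketch: zero out invalid actions before sampling.

    logits: raw policy scores, shape (n_actions,)
    mask:   boolean array, True where the action is valid
    """
    # Invalid logits are pushed to -inf so they receive zero probability
    masked = np.where(mask, logits, -np.inf)
    # Subtract the max over *valid* entries for numerical stability
    exp = np.exp(masked - masked[mask].max())
    return exp / exp.sum()

logits = np.array([2.0, 1.0, 0.5, -1.0])
mask = np.array([True, False, True, False])
probs = masked_softmax(logits, mask)
# probs[1] and probs[3] are exactly 0; valid probabilities sum to 1
```

In MaskablePPO the mask is supplied by the environment at each step, but the underlying transformation of the logits is the same as above.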
-
Hey, I'm trying to use SBX, specifically `DroQ`, with `MultiInputPolicy`. I get the error:
```
ValueError: Policy MultiInputPolicy unknown
```
Why is that? I thought SBX is compatible with SB3.
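SBX reimplements SB3 algorithms in JAX but each algorithm registers only the policy aliases it actually supports, so an alias that exists in SB3 can still be unknown to an SBX algorithm. A hypothetical sketch of how that kind of name lookup produces this exact error (the registry dict and `get_policy` helper here are illustrative, not SBX's real internals):

```python
# Hypothetical registry: an algorithm that only ships an MLP policy.
# SBX's DroQ may similarly register fewer aliases than SB3's algorithms do.
POLICY_REGISTRY = {"MlpPolicy": "MlpPolicyClass"}

def get_policy(name):
    """Resolve a policy alias, mirroring the SB3-style error message."""
    if name not in POLICY_REGISTRY:
        raise ValueError(f"Policy {name} unknown")
    return POLICY_REGISTRY[name]
```

With this registry, `get_policy("MultiInputPolicy")` raises `ValueError: Policy MultiInputPolicy unknown`, matching the error above, while `get_policy("MlpPolicy")` succeeds.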