-
`example/recorders/botaction_recorder.py` states that
> This is a small server that accepts connections on a websocket port and writes it to a file.
>
> The purpose is to allow a universe-env wi…
-
http://arxiv.org/pdf/1602.01783v1.pdf describes asynchronous methods using off-policy (1-step / n-step Q-learning) and even on-policy (SARSA and advantage actor-critic (A3C)) reinforcement learning.
T…
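The 1-step and n-step targets that the paper contrasts can be sketched as follows. This is a minimal illustration under my own naming, not the paper's implementation; the function names and signatures are assumptions for the sketch.

```python
def n_step_return(rewards, gamma, bootstrap_value):
    """n-step target: r_t + gamma*r_{t+1} + ... + gamma^n * bootstrap_value.

    With a single reward and bootstrap_value = max_a Q(s', a), this reduces
    to the familiar 1-step Q-learning target r + gamma * max_a Q(s', a).
    """
    ret = bootstrap_value
    # Fold the discounted sum from the last reward back to the first.
    for r in reversed(rewards):
        ret = r + gamma * ret
    return ret


def advantage(rewards, gamma, bootstrap_value, baseline_value):
    """A3C-style advantage: how much the n-step return exceeds the
    critic's value estimate at the starting state."""
    return n_step_return(rewards, gamma, bootstrap_value) - baseline_value
```

The same `n_step_return` helper covers both off-policy variants (by bootstrapping with a max over Q-values) and the actor-critic case (by bootstrapping with the critic's state value), which is why the paper can present them in one framework.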
-
Executing the initial steps ("to get started, first install universe") in the README
### Expected behavior
Installation should be successful.
### Actual behavior
name@name-System-Product-Name:~/source$ g…
-
(First, please check https://github.com/openai/universe/wiki/Solutions-to-common-problems for solutions to many common problems)
### Expected behavior
```
import gym
import universe
import rand…
-
(First, please check https://github.com/openai/universe/wiki/Solutions-to-common-problems for solutions to many common problems)
### Expected behavior
I have run it before and it successfully load…
-
### Expected behavior
connect to vnc on port 5900 etc. and see core Atari game running
### Actual behavior
game screen tears in between frames for seconds at a time: game/black/game/black/game/ga…
-
At first I could execute `import universe`,
but had the [`No registered env with id: flashgames.DuskDrive-v0`](https://github.com/openai/universe/issues/37) issue.
I found that I didn't set up docke…
-
Hi,
I just found your project on the NVIDIA blog.
Why not integrate it into https://github.com/openai/universe?
-
Currently, `ClientAgent` is the most basic agent class, but it's simply too coupled with the Pac-Man simulator. For instance, it inherits from `BerkeleyGameAgent`, implements the `getAction` method, c…
-
I'm a little confused about the meaning of the variable `target`, which is the argument of the following two functions:
1) `VRClassReward:updateOutput(input, target)`
2) `VRClassReward:updateGradInpu…