-
I used the Keras-RL DDPG agent and trained it overnight.
At the start, with randomly initialized weights, I can see the screen, and the car barely moves, as expected. Then I trained it overnight.
Bu…
-
Trying to work with window lengths greater than 1, e.g.
```
WINDOW_LENGTH = 4
model = Sequential()
model.add(Flatten(input_shape=(WINDOW_LENGTH,7,7,512)))
model.add(Dense(1024))
model.ad…
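# A hedged sketch (pure NumPy; the names are hypothetical, not keras-rl code)
# of what the Flatten layer above does to a windowed observation: keras-rl
# stacks WINDOW_LENGTH feature maps, so flattening a (WINDOW_LENGTH, 7, 7, 512)
# input yields a vector of 4 * 7 * 7 * 512 = 100352 entries per sample.
import numpy as np

WINDOW_LENGTH = 4
batch = np.zeros((1, WINDOW_LENGTH, 7, 7, 512))  # (batch, window, h, w, channels)
flat = batch.reshape(batch.shape[0], -1)         # what Flatten produces
print(flat.shape)  # (1, 100352)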
-
When I try to render video on my computer, Python often crashes, and then it seems I am not able to record the video successfully. The car either does not move, or the screen is black. Is there a way t…
-
Hi
I am using the following code to monitor my training process, but I encountered an error. I do not know if this is a bug or if I am using the Monitor class wrong.
`def __init(...…
-
In keras-rl there are checks such as
```
if hasattr(actor.output, '__len__') and len(actor.output) > 1:
raise ValueError('Actor "{}" has more than one output. DDPG expects an actor that has a s…
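A minimal, self-contained sketch of why that guard works (the `FakeTensor` stand-in below is hypothetical, not keras-rl code): a single Keras output tensor typically has no usable `__len__`, while a multi-output actor's `output` attribute is a Python list, so only the latter trips the check.

```python
def check_single_output(output):
    # Mirrors the keras-rl guard: a list of output tensors has __len__ > 1.
    if hasattr(output, '__len__') and len(output) > 1:
        raise ValueError('Actor has more than one output.')

class FakeTensor:
    """Hypothetical stand-in for a single Keras output tensor (no __len__)."""

check_single_output(FakeTensor())                      # single output: passes
try:
    check_single_output([FakeTensor(), FakeTensor()])  # multi-output: rejected
except ValueError as e:
    print('rejected:', e)
```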
-
Hi,
How would one save a trained agent (say, DDPG)? Thank you.
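If I recall correctly, keras-rl agents expose `save_weights(filepath)` and a matching `load_weights(filepath)` (for DDPG this covers the actor and critic weights). A minimal sketch of the checkpoint round-trip, with the agent's weights mocked as a plain dict and `pickle` standing in for the real weight files so it runs standalone:

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for a trained agent's weights; with keras-rl you
# would instead call agent.save_weights('ddpg_weights.h5f') after training.
weights = {'actor': [0.1, 0.2, 0.3], 'critic': [0.4, 0.5]}

path = os.path.join(tempfile.mkdtemp(), 'ddpg_weights.pkl')
with open(path, 'wb') as f:
    pickle.dump(weights, f)       # save after training

with open(path, 'rb') as f:
    restored = pickle.load(f)     # load before evaluation

print(restored == weights)  # True
```

To re-use the agent later, rebuild the same actor and critic model architectures first, then load the saved weights into the new agent.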
-
@ludc and @korymath are interested in building out some RL algorithms and doing OpenAI Gym integration.
Kory from his repo: https://github.com/korymath/examples/tree/master/rl hasn't yet started on…
-
In the Humanoid-v1 env, this error also appears after running for some time; if STEPS is smaller, the error appears later.
[2016-05-28 10:53:54,309] Starting new video recorder writing to /Users/lmj/Documents/t/ddpgout/1/…
-
Hey, just made a [submission](https://gym.openai.com/evaluations/eval_jCJs99SpSjjoya6MTkg) for InvertedDoublePendulum but results are weird/wrong:
Only one video is displayed and it is falsely linked…
-
When I use the DDPG algorithm, I set `plot=True`, but the evaluation run after each iteration didn't appear.
So what's the problem?