-
-
We batted this around in Gitter, but I'm now fairly confident there is indeed a memory leak in DQNAgent (or a parent class). At least it's something worth keeping an eye on while y'all are working on the rew…
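One framework-agnostic way to confirm a leak like this is to diff `tracemalloc` snapshots across episodes; in this sketch the training step is a hypothetical stand-in (a list that keeps growing), not DQNAgent itself:

```python
import tracemalloc

# Hypothetical stand-in for one training episode; in the real case this
# would be a loop of agent.act()/agent.observe() calls on the DQNAgent.
def run_episode(store):
    store.append([0] * 1000)  # simulate per-episode allocations that are never freed

tracemalloc.start()
leaky = []
snapshot_before = tracemalloc.take_snapshot()

for _ in range(100):
    run_episode(leaky)

snapshot_after = tracemalloc.take_snapshot()

# Source lines whose allocated size grew the most between the two
# snapshots point at the leaking allocation site.
top = snapshot_after.compare_to(snapshot_before, "lineno")
for stat in top[:3]:
    print(stat)
```

If the top entries keep growing as you raise the episode count, the corresponding allocation site is a good leak candidate.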
-
When I run `python examples/openai_gym_async.py Pong-ram-v0 -a VPGAgent -c examples/configs/vpg_agent.json -n examples/configs/vpg_network.json -w 3 -D` and tmux in with `tmux a -t openai_async`, all'…
-
OK,
today I compiled the driver and ran an up-to-date clinfo tool (full report below).
The extension differences vs. the Windows driver are:
the Linux driver has (and Windows does not):
cl_khr_priority_hints cl_khr…
-
Hi, I want to run some experiments to reproduce a paper's results with TensorForce, mainly related to a continuous-control problem using PPO/TRPO. I tried different configurations many times, but the re…
-
Learning finished. Total episodes: 3000. Average reward of last 100 episodes: 17.76.
That doesn't look right.
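For reference, "average reward of the last 100 episodes" can be tracked with a bounded deque; this is a generic sketch, not TensorForce's own reporting code:

```python
from collections import deque

rewards = deque(maxlen=100)  # keeps only the most recent 100 episode rewards

def record(reward):
    """Append an episode reward and return the running average of the window."""
    rewards.append(reward)
    return sum(rewards) / len(rewards)

# Example: 3000 episodes at a constant reward of 17.76 average to 17.76.
for _ in range(3000):
    avg = record(17.76)
print(round(avg, 2))
```

With a window like this, an average of 17.76 means the agent has been earning roughly that reward per episode over the most recent 100 episodes, regardless of how the earlier 2900 went.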
-
Over email:
```
would you be willing to make the following additions to the VIS website to advertise the VIS dissertation award nominations?
1. add to the Important Dates page http://ieeevis.…
```
-
As you know, the portfolio weights appear to be static on the test set.
Why didn't the model learn much?
I suspect several reasons:
1. there is no ensemble in the DDPG network
2. the input doesn't include pr…
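One more possibility worth checking: if the deterministic actor gets no exploration during training, it can collapse onto a single action. A common DDPG remedy is adding Ornstein-Uhlenbeck noise to the actor output at training time; a minimal framework-independent sketch (the 4-asset portfolio and all parameter values here are illustrative assumptions):

```python
import numpy as np

class OUNoise:
    """Ornstein-Uhlenbeck process: temporally correlated noise commonly
    added to DDPG actions to encourage exploration in continuous spaces."""

    def __init__(self, size, mu=0.0, theta=0.15, sigma=0.2, seed=0):
        self.mu = mu * np.ones(size)
        self.theta = theta
        self.sigma = sigma
        self.rng = np.random.default_rng(seed)
        self.state = self.mu.copy()

    def sample(self):
        # dx = theta * (mu - x) + sigma * N(0, 1): mean-reverting random walk
        dx = self.theta * (self.mu - self.state) + self.sigma * self.rng.standard_normal(len(self.state))
        self.state = self.state + dx
        return self.state

noise = OUNoise(size=4)                  # e.g. 4 portfolio weights
action = np.full(4, 0.25)                # deterministic actor output
noisy_action = action + noise.sample()   # exploratory action used during training
```

At test time the noise is dropped and the raw actor output is used; if the weights are still static then, the problem is more likely the inputs or the network itself.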
-
## Issue Report Template
### Tectonic Version
1.6.4
### Environment
AWS
### Expected Behavior
I am using the graphical installer and wish to use an existing VPC with a number of pre-exist…
-
Hi,
first of all, thanks for the hard work that is going into this project. You are saving me a ton of work.
Second, I encountered some strange behavior when trying to define an agent with multiple …