-
A lot of research in the field of RL is being done nowadays.
I thought it could be both interesting and productive to have a post that would bring in new research from time to time that might be relevant …
-
Thank you so much for this great project.
When I try to run ddqn_rl_trader.py on Windows (my computer has no GPU, so I use LSTM instead of CuDNNLSTM), I get the following errors:
2019-01-17 17:0…
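One common pitfall with this kind of swap (a sketch under assumptions, not the project's actual code): `CuDNNLSTM` hard-wires `activation='tanh'` and `recurrent_activation='sigmoid'`, while older plain `LSTM` layers defaulted `recurrent_activation` to `'hard_sigmoid'`, so a naive replacement can change behavior or break weight compatibility. A small wrapper keeps the configurations aligned:

```python
# Hedged sketch: a CPU-friendly drop-in for CuDNNLSTM.
# `cpu_lstm` is an illustrative helper name, not part of the repo.
from tensorflow.keras.layers import LSTM

def cpu_lstm(units, **kwargs):
    # Pin the activations CuDNNLSTM uses implicitly, so the plain
    # LSTM computes the same function (just slower, on CPU).
    return LSTM(units,
                activation='tanh',
                recurrent_activation='sigmoid',
                **kwargs)
```

With matching activations, weights saved from a `CuDNNLSTM` model can generally be loaded into the plain-LSTM version and vice versa.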
-
Spoke with Kevin Day, who runs an impact investment fund in Canada with a limited pool of investors.
As we are very far away from building securities trading capabilities, it might be appropriate to put th…
zynis updated
10 years ago
-
### babi
This suggestion came up yesterday in chat with celtic. We previously had a feature that allowed a player to interact with another player to give items. Example: Omg I have lost my orb! a player comes …
-
Hello, this seems like nice work.
I have some questions though.
When you're comparing on the test data, what does the market value refer to?
From your perspective, do you think this is a good approach?
…
-
In 12.2.4 it is said that if a route may not be entered, the entry point (closest location) of the route shall be treated as the EOA. How is it guaranteed that this location is not beyond the EOA given by the MA…
-
When trying to run the example I get this error:
> While evaluating the form starting at line 56, column 0
> of #P"/home/jvrssc/src/lisp/trading-core/examples/backtesting-simulation.lisp":
>
…
-
Raising [this issue](https://github.com/keras-team/keras/issues/15887) again here as it still seems to be present in the current code base.
The data in my model is tiny (python train_models.py
M…
-
An agent trained via ElegantRL, given the same input state, outputs different and even seemingly random actions on each prediction.
Shouldn't an agent output deterministic actions after learnin…
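A likely explanation (sketched here with a toy Gaussian policy in NumPy; the function and variable names are illustrative, not ElegantRL's actual API): many actor-critic agents learn a stochastic policy and *sample* an action during training for exploration, while at evaluation time you are expected to take the deterministic mean of the distribution instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Gaussian policy head: state -> (mean, std), illustrative only.
def policy(state, explore=True):
    mean = np.tanh(state.sum())   # deterministic part of the policy
    std = 0.1
    if explore:
        # training / exploration: sample -> a different action each call
        return mean + std * rng.standard_normal()
    # evaluation: return the distribution mean -> same action each call
    return mean

s = np.array([0.2, -0.1, 0.4])
assert policy(s, explore=False) == policy(s, explore=False)
```

If the library exposes a deterministic or evaluation mode for inference, using it should make repeated predictions on the same state identical.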
-
Several issues here:
- balance is "too low", but the agent is still working
- "to run your agent", but it says the agent is working
- $2 is too low – the app will immediately transfer this to t…