EmbersArc / PPO — PPO implementation for an OpenAI gym environment based on Unity ML Agents
148 stars · 21 forks
Issues
#6 · How to load the trained model — opened by Eulerianial 5 years ago · 0 comments
#5 · No registered env with id: RocketLander-v0 — opened by ksajan 5 years ago · 5 comments
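The error in issue #5 typically means `gym.make("RocketLander-v0")` was called before the custom environment was registered with gym. A minimal sketch of the registration step, assuming a hypothetical module path `rocket_lander` and class name `RocketLander` (the real entry point must match wherever the environment class actually lives in this repository):

```python
import gym
from gym.envs.registration import register

# Register the custom environment under the id the training script expects.
# The entry_point below is an assumption; point it at the actual env class.
register(
    id="RocketLander-v0",
    entry_point="rocket_lander:RocketLander",
)

# Registration is lazy: the module is only imported when gym.make is called,
# so the spec can be inspected even before the env module is importable.
spec = gym.spec("RocketLander-v0")
```

After registration, `gym.make("RocketLander-v0")` resolves normally instead of raising the "No registered env" error.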
#4 · Probability of Action — opened by Datoclement 6 years ago · 1 comment
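Issue #4 asks about the probability of an action. For continuous control, PPO implementations commonly use a diagonal Gaussian policy, and the quantity that enters the PPO ratio is the log-density of the sampled action, not a probability mass. A minimal numpy sketch of that computation (the variable names are illustrative, not taken from this repository's code):

```python
import numpy as np

def gaussian_log_prob(action, mu, sigma):
    """Log-density of `action` under a diagonal Gaussian with mean `mu`
    and per-dimension standard deviation `sigma` (summed over dimensions)."""
    return np.sum(
        -0.5 * np.log(2 * np.pi * sigma**2) - (action - mu) ** 2 / (2 * sigma**2)
    )

# Example: one-dimensional action 0.1 under a standard normal policy.
lp = gaussian_log_prob(np.array([0.1]), np.array([0.0]), np.array([1.0]))
density = np.exp(lp)  # a density, which can exceed 1 for narrow sigma
```

For continuous actions the exponentiated value is a probability *density*, so values above 1 are legitimate; only ratios or log-differences between old and new policies are meaningful in the PPO objective.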
#3 · Errors when running ppo.py: Failed on kwargs for b2PolygonShape.vertices: count >= 3 — opened by timelong 6 years ago · closed · 0 comments
#2 · Errors when running ppo.py: Failed on kwargs for b2PolygonShape.vertices: count >= 3 — opened by timelong 6 years ago · closed · 0 comments
#1 · Errors when running PPO: b2PolygonShape.vertices: count >= 3 — opened by timelong 6 years ago · closed · 0 comments
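The three closed issues (#1–#3) all report the same pybox2d validation failure: `b2PolygonShape` rejects a `vertices` keyword argument with fewer than three points. A pure-Python sketch of the constraint being enforced (the helper name is illustrative; pybox2d performs this check internally when the shape is constructed):

```python
def polygon_vertices_ok(vertices, min_count=3):
    """Mimic the check pybox2d applies to b2PolygonShape(vertices=...):
    a polygon needs at least three vertices to be well-formed."""
    return len(vertices) >= min_count

# A triangle is the smallest valid polygon; two points trigger the
# "count >= 3" error reported in these issues.
triangle_ok = polygon_vertices_ok([(0, 0), (1, 0), (0, 1)])
segment_ok = polygon_vertices_ok([(0, 0), (1, 0)])
```

In practice this error usually traces back to a degenerate vertex list being passed to the environment's fixture construction, often due to a Box2D/pybox2d version mismatch, so checking the installed `box2d-py` version is a reasonable first step.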