In ThreadTrainer.py, I don't understand how the following lines are supposed to affect the batch size:

```python
np.concatenate((x__, x_))
np.concatenate((r__, r_))
np.concatenate((a__, a_))
```
np.concatenate returns the merged array, but it does not modify x__ or x_ in place.
However, I do measure a drop in TPS. What sorcery is this?
[Time: 404] [Episode: 213 Score: -1.0642] [RScore: 7.5345 RPPS: 281] [PPS: 282 TPS: 4] [NT: 2 NP: 3 NA: 4]
(PPS/TPS are low overall in my case because the game is a costly one running over remote desktop.)
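For reference, a quick standalone check (my own minimal sketch, not code from the repo) showing that np.concatenate leaves its arguments untouched unless you assign the result back:

```python
import numpy as np

x__ = np.zeros((2, 4))   # accumulated batch so far
x_  = np.ones((3, 4))    # new experiences just pulled in

np.concatenate((x__, x_))        # result is discarded; x__ still has 2 rows
print(x__.shape)                 # (2, 4)

x__ = np.concatenate((x__, x_))  # assigning the result back actually grows the batch
print(x__.shape)                 # (5, 4)
```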
EDIT: I suggest modifying it to:

```python
x__ = np.concatenate((x__, x_))
r__ = np.concatenate((r__, r_))
a__ = np.concatenate((a__, a_))
```
but this does not change the TPS compared to the original code.
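For context, here is a minimal sketch of how the accumulation loop could look with the assignment added. It is my own illustration, assuming a consumer that pulls experience chunks from a queue until a minimum batch size is reached; the variable names follow the issue, and accumulate_batch, queue, and min_batch_size are hypothetical, not taken from the repo:

```python
import numpy as np

def accumulate_batch(queue, min_batch_size):
    # Keep pulling experience chunks until the accumulated batch is big enough.
    x__, r__, a__ = None, None, None
    while x__ is None or len(x__) < min_batch_size:
        x_, r_, a_ = queue.get()          # next chunk of (states, rewards, actions)
        if x__ is None:
            x__, r__, a__ = x_, r_, a_
        else:
            # Assign the concatenated result back, otherwise the batch never grows.
            x__ = np.concatenate((x__, x_))
            r__ = np.concatenate((r__, r_))
            a__ = np.concatenate((a__, a_))
    return x__, r__, a__
```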
Yeap, that's a bug right there! Thanks for noticing it. That's what happens when we don't test all the configurations. Please submit a pull request if you can; otherwise I will go ahead and fix it.
okey dokey