Open grizzlybearg opened 4 months ago
After further investigation, I've found that this error shows up if you use `minibatch_size` as part of the hyperparameter mutations dict with PBT or PB2. I'm not sure whether this affects other search algorithms. #43467 would solve this.
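For context, here is a minimal sketch of the kind of mutations dict meant above. The keys and candidate values are illustrative assumptions, not taken from the original report:

```python
# Hypothetical PBT/PB2 hyperparam_mutations dict (illustrative values only).
# PBT accepts a list of candidate values (or a callable) per key; including
# "minibatch_size" here is what appears to trigger the error described above.
hyperparam_mutations = {
    "lr": [1e-3, 5e-4, 1e-4],
    "train_batch_size": [500, 1000, 2000],
    "minibatch_size": [64, 128, 256],  # the problematic key
}
```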
What happened + What you expected to happen
I'd like to customize my `APPOConfig` config dict, specifically the `minibatch_size` attribute / param for `APPOConfig.training`. However, providing `minibatch_size` to the `training` method raises an error.

I'd like clarification as to whether this is by design, given that APPO subclasses Impala, which has a `minibatch_size` property: https://github.com/ray-project/ray/blob/10009390b0ff61875b45cbab75052a89332b528e/rllib/algorithms/impala/impala.py#L455-L466

In addition, `minibatch_size` is also a parameter of the `ImpalaConfig.training` method, so I can't tell why the `minibatch_size` kwargs key is not passed to the `ImpalaConfig.training` method, as shown in https://github.com/ray-project/ray/blob/10009390b0ff61875b45cbab75052a89332b528e/rllib/algorithms/appo/appo.py#L196

Versions / Dependencies
Ray 2.9.3
Python 3.11.7
Reproduction script
```python
class HPRanges:
    ...  # body elided in the original report

class APPOLearnerHPs:
    ...  # body elided in the original report
```
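As an aside, the suspected mechanism (a subclassed `training()` method that does not forward a kwarg to its parent) can be shown with a plain-Python sketch. This is illustrative only; the class names are made up and this is not the actual Ray/RLlib source:

```python
# Illustrative sketch only -- NOT the actual RLlib code. It mimics a config
# class whose subclass overrides training() but never forwards the extra
# kwargs to its parent, so the setting is silently dropped.

class ParentConfig:
    """Stands in for a base config (e.g. something Impala-like)."""

    def __init__(self):
        self.minibatch_size = None

    def training(self, *, minibatch_size=None, **kwargs):
        # The parent knows about `minibatch_size` and stores it.
        if minibatch_size is not None:
            self.minibatch_size = minibatch_size
        return self


class ChildConfig(ParentConfig):
    """Bug pattern: kwargs arrive here but are not passed up the chain."""

    def training(self, *, lr=None, **kwargs):
        super().training()  # <-- `minibatch_size` in kwargs is lost here
        return self


class FixedChildConfig(ParentConfig):
    """Fix: forward the unrecognized kwargs to the parent."""

    def training(self, *, lr=None, **kwargs):
        super().training(**kwargs)
        return self


broken = ChildConfig().training(minibatch_size=128)
fixed = FixedChildConfig().training(minibatch_size=128)
print(broken.minibatch_size)  # None -- the setting was dropped
print(fixed.minibatch_size)   # 128
```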
Issue Severity
Medium: It is a significant difficulty but I can work around it.