**Seraphaious** opened this issue 1 year ago
```python
ERL_PARAMS = {
    "learning_rate": 2e-6,
    "batch_size": 4096,
    "gamma": 0.99,
    "seed": 312,
    "net_dimension": [256, 128],
    "target_step": 10000,
    "eval_gap": 20,
    "eval_times": 1,
}
```
Every time I run the training and backtest the results, they are entirely different. If the seed is the same, shouldn't the results be consistent?
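One common cause (an assumption here, since the training code is not shown) is that the `seed` entry in `ERL_PARAMS` only seeds part of the pipeline: Python's `random`, NumPy, and PyTorch each keep their own RNG state, and GPU kernels add further nondeterminism unless cuDNN is forced into deterministic mode. A minimal sketch of seeding everything at once, using a hypothetical `seed_everything` helper (the `random`/NumPy part is runnable; the PyTorch calls are listed in the docstring to keep the sketch dependency-free):

```python
import random
import numpy as np

def seed_everything(seed: int) -> None:
    """Seed every RNG the training loop may touch.

    In a PyTorch-based setup (e.g. ElegantRL) you would additionally need:
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False
    Even then, some CUDA ops have no deterministic implementation, so
    GPU runs may still diverge slightly.
    """
    random.seed(seed)
    np.random.seed(seed)

# Two runs seeded identically draw identical samples:
seed_everything(312)
run_a = np.random.rand(3)
seed_everything(312)
run_b = np.random.rand(3)
assert np.allclose(run_a, run_b)
```

If results still differ after seeding all libraries, the remaining variance usually comes from the environment itself (e.g. the backtest data feed or exploration noise being seeded elsewhere), which is worth checking in the trainer's source.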