Open · MHJYR opened this issue 2 years ago
Hi, can you give a full command line and a stack trace? Also, which version are you using? Thanks
Hi bartokg,
Yes, of course; apologies. tf-agents version: 0.12.0. This version does not seem to include BanditReplayBuffer. I changed lines 44-46 to point to the mushroom data, i.e.,
flags.DEFINE_string('mushroom_csv', 'C:/mushroom_csv.csv', 'Location of the csv file containing the mushroom dataset.')
Stack trace for the first problem:
runcell(0, 'C:/Users/python/agents3/tf_agents/bandits/agents/examples/v2/train_eval_mushroom.py')
2022-04-07 17:42:17.596199: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'cudart64_110.dll'; dlerror: cudart64_110.dll not found
2022-04-07 17:42:17.596452: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
Traceback (most recent call last):
File "C:\Users\python\agents3\tf_agents\bandits\agents\examples\v2\train_eval_mushroom.py", line 111, in
Thanks for the info!
There was a fix in https://github.com/tensorflow/agents/tree/master/tf_agents/bandits/replay_buffers: an init file was added that makes the bandit replay buffer visible. It is not present in 0.12.0.
Can you please switch to the latest version, or add the init file yourself? Thanks, Gabor
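For reference, that kind of fix usually boils down to a one-line re-export in the package's init file. A sketch of what it might look like (module and symbol names are assumptions inferred from the linked repo path, not verified against a specific release):

```python
# Sketch of tf_agents/bandits/replay_buffers/__init__.py after the fix:
# re-exporting the module makes BanditReplayBuffer reachable through
# the package rather than only through its defining module.
from tf_agents.bandits.replay_buffers import bandit_replay_buffer
```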
It works, thanks!
Hi,
I am trying to run train_eval_mushroom.py and get an attribute error in trainer.py:
AttributeError: module 'tf_agents.replay_buffers.replay_buffer' has no attribute 'BanditReplayBuffer'
Also, when running the code twice I get:
DuplicateFlagError: The flag 'root_dir' is defined twice.
Thanks.