Open · Stentaur opened this issue 4 years ago
I get the same error when plugging a DdpgAgent into the SAC tutorial notebook.
Did you solve it? I hit the same error when I implemented a Td3Agent.
Changing the network's parent class from tf_agents.networks.Network to tf.keras.layers.Layer (with the corresponding __init__ change) helped me find where the bug was, since the resulting error message was clearer.
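For anyone who wants to try the same trick, here is a minimal sketch of what I mean (the class and names are made-up examples, not code from this thread): temporarily subclass tf.keras.layers.Layer instead of tf_agents.networks.network.Network, so Keras raises its own, more descriptive error when the inputs don't match.

```python
import tensorflow as tf

class DebugActorNet(tf.keras.layers.Layer):  # was: network.Network
    def __init__(self, action_dim, name='DebugActorNet'):
        # Layer.__init__ only needs a name; Network.__init__ also wanted
        # input_tensor_spec and state_spec, so those are dropped here.
        super().__init__(name=name)
        self._out = tf.keras.layers.Dense(action_dim, activation='tanh')

    def call(self, observations, step_type=None, network_state=()):
        # Keep the (output, network_state) contract that TF-Agents
        # networks use, so the rest of the code stays unchanged.
        return self._out(observations), network_state
```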
Same problem with TD3... any help is appreciated!
I resolved this by using tf_agents.agents.ddpg.actor_network.ActorNetwork. tf_agents.networks.actor_distribution_network.ActorDistributionNetwork does not work, because DdpgAgent expects a deterministic actor that outputs action tensors, while ActorDistributionNetwork outputs a distribution. A sketch of the working setup is below.
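A minimal sketch of that setup, assuming a Pendulum-style continuous-control environment in place of the custom one (layer sizes and learning rates are placeholders):

```python
import tensorflow as tf
from tf_agents.agents.ddpg import actor_network, critic_network, ddpg_agent
from tf_agents.environments import suite_gym, tf_py_environment

# Stand-in continuous-action environment ('Pendulum-v1' on newer gym versions).
env = tf_py_environment.TFPyEnvironment(suite_gym.load('Pendulum-v0'))
obs_spec = env.observation_spec()
action_spec = env.action_spec()

# Deterministic actor: outputs actions directly, which is what DdpgAgent expects.
actor_net = actor_network.ActorNetwork(
    obs_spec, action_spec, fc_layer_params=(400, 300))

# Critic takes (observation, action) pairs and outputs a Q-value.
critic_net = critic_network.CriticNetwork(
    (obs_spec, action_spec), joint_fc_layer_params=(400, 300))

agent = ddpg_agent.DdpgAgent(
    env.time_step_spec(), action_spec,
    actor_network=actor_net, critic_network=critic_net,
    actor_optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    critic_optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3))
agent.initialize()
```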
I'm trying to use a DDPG agent with actor and critic networks and a TFUniformReplayBuffer, training on my custom environment.
I've extracted a training experience from the buffer, but when I call
agent.train(experience)
I get a TypeError.
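For anyone reproducing this, a minimal sketch of the sample-and-train step, reusing env and agent from the sketch above (buffer size, batch size, and step counts are placeholders, not the original poster's values):

```python
from tf_agents.drivers import dynamic_step_driver
from tf_agents.replay_buffers import tf_uniform_replay_buffer

replay_buffer = tf_uniform_replay_buffer.TFUniformReplayBuffer(
    data_spec=agent.collect_data_spec,
    batch_size=env.batch_size,
    max_length=10000)

# Fill the buffer with a few collect steps so sampling doesn't fail.
dynamic_step_driver.DynamicStepDriver(
    env, agent.collect_policy,
    observers=[replay_buffer.add_batch],
    num_steps=200).run()

# num_steps=2 yields adjacent transition pairs, which DdpgAgent.train expects.
dataset = replay_buffer.as_dataset(
    sample_batch_size=64, num_steps=2).prefetch(3)
experience, _ = next(iter(dataset))
loss_info = agent.train(experience)
```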