weidler / RLaSpa

Reinforcement Learning in Latent Space
MIT License

Double Check the Workings of all Networks #39

Closed · weidler closed this issue 5 years ago

weidler commented 5 years ago

I noticed that in some cases the use of activation functions appears rather arbitrary. For example, in the PixelEncoders we had a sigmoid on the representation but not on the heads, although it is the heads that should definitely have one.
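
For reference, a minimal sketch of the arrangement argued for above, assuming a PyTorch-style module: the latent representation is left unsquashed, while the heads end in a sigmoid. The class name, layer sizes, 84x84 input, and head targets are illustrative assumptions, not taken from the repository.

```python
# Minimal sketch (PyTorch), not the actual RLaSpa code: layer sizes, the
# 84x84 input, and the head targets are assumptions made for illustration.
import torch
import torch.nn as nn


class PixelEncoderSketch(nn.Module):
    """Pixel encoder with a latent representation and two prediction heads."""

    def __init__(self, latent_dim: int = 32):
        super().__init__()
        # Convolutional trunk producing the latent representation.
        # No sigmoid here: the representation stays unsquashed.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 19 * 19, latent_dim),  # 19x19 feature map for 84x84 input
        )
        # The heads predict (normalised) pixels, so they end in a sigmoid
        # that bounds their outputs to [0, 1].
        self.state_head = nn.Sequential(nn.Linear(latent_dim, 84 * 84), nn.Sigmoid())
        self.next_state_head = nn.Sequential(nn.Linear(latent_dim, 84 * 84), nn.Sigmoid())

    def forward(self, pixels: torch.Tensor):
        z = self.encoder(pixels)
        return z, self.state_head(z), self.next_state_head(z)
```

The idea, under these assumptions, is that the head outputs are bounded to match pixel targets in [0, 1], while the latent code itself is not needlessly constrained.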