Open ajinkyakulkarni14 opened 7 years ago
In the log file, you can see "do_pretraining : False".
Can someone explain how to use this option in the configuration file?
Thanks!

do_pretraining is not required in the current setup; it was previously used for training autoencoders.

Thank you for your answer. I am asking because I am planning to implement deep autoencoders inside Merlin as a pre_training module.
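For anyone landing on this thread: judging from the log line quoted above, the flag would be toggled in the .conf file passed to Merlin. A minimal sketch follows; the section name and the neighbouring key are assumptions for illustration, not confirmed Merlin documentation.

```ini
[Architecture]
; Hypothetical sketch: enabling the (currently unused) pre-training path.
; The [Architecture] section and the pretraining_epochs key are assumptions.
do_pretraining: True
pretraining_epochs: 10
```

Since the maintainer notes the option is unused in the current setup, setting it would only matter once an autoencoder pre-training module is wired back in.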