WYejian opened this issue 2 years ago
Hey, sorry for the delayed reply. The maximum input-string length is hard-coded into the model: it is constrained by the kernel sizes and strides of the convolutional (and de-convolutional) layers in the bottleneck. If 125 tokens is too restrictive for your use case, you could adapt those layers to accept longer strings.
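To see why the maximum length ends up baked into the architecture, here is a minimal sketch (not the repository's actual code; the kernel sizes and strides are hypothetical) of how each 1-D convolution maps an input length to one specific output length. The flatten/linear layer after the bottleneck is sized for that exact output length, so feeding a different maximum length breaks the shape match:

```python
def conv_out_len(length, kernel_size, stride=1, padding=0):
    # Standard 1-D convolution output-length formula:
    # L_out = floor((L_in + 2*pad - kernel) / stride) + 1
    return (length + 2 * padding - kernel_size) // stride + 1

# Hypothetical encoder stack: three conv layers as (kernel_size, stride).
layers = [(9, 1), (9, 1), (11, 1)]

def encoded_len(max_len):
    # Chain the length formula through every conv layer.
    L = max_len
    for k, s in layers:
        L = conv_out_len(L, k, s)
    return L

print(encoded_len(125))  # the length the dense layer after the bottleneck expects
print(encoded_len(200))  # a longer maximum yields a different length,
                         # so the hard-coded dense layer no longer fits
```

In other words, raising the maximum length is not just a config change: the dense layers sized from these outputs (and the mirrored de-convolutions in the decoder) must be re-derived for the new length.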
Does this VAE model limit the maximum length of the input SMILES string, without exposing that length as a configurable hyperparameter?