insilicomedicine / GENTRL

Generative Tensorial Reinforcement Learning (GENTRL) model

KeyError #31

Open fraulifang opened 3 years ago

fraulifang commented 3 years ago

When I pretrained the model, the code raised a KeyError:

```
KeyError                                  Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 model.train_as_vaelp(train_loader, lr=1e-4)

3 frames
/content/drive/My Drive/Colab Notebooks/GENTRL-master/gentrl/gentrl.py in train_as_vaelp(self, train_loader, num_epochs, verbose_step, lr)
    150                 if to_reinit:
    151                     if (buf is None) or (buf.shape[0] < 5000):
--> 152                         enc_out = self.enc.encode(x_batch)
    153                         means, log_stds = torch.split(enc_out,
    154                                                       len(self.latent_descr),

/content/drive/My Drive/Colab Notebooks/GENTRL-master/gentrl/encoder.py in encode(self, sm_list)
     24         """
     25
---> 26         tokens, lens = encode(sm_list)
     27         to_feed = tokens.transpose(1, 0).to(self.embs.weight.device)
     28

/content/drive/My Drive/Colab Notebooks/GENTRL-master/gentrl/tokenizer.py in encode(sm_list, pad_size)
     63     for s in sm_list:
     64         tokens = ([1] + [__t2i[tok]
---> 65                          for tok in smiles_tokenizer(s)])[:pad_size - 1]
     66         lens.append(len(tokens))
     67         tokens += (pad_size - len(tokens)) * [2]

/content/drive/My Drive/Colab Notebooks/GENTRL-master/gentrl/tokenizer.py in <listcomp>(.0)
     63     for s in sm_list:
     64         tokens = ([1] + [__t2i[tok]
---> 65                          for tok in smiles_tokenizer(s)])[:pad_size - 1]
     66         lens.append(len(tokens))
     67         tokens += (pad_size - len(tokens)) * [2]

KeyError: '@'
```
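The traceback shows the tokenizer's vocabulary (`__t2i`) has no entry for `'@'`, the SMILES chirality marker, so any input molecule with stereochemistry annotations crashes `encode`. One possible workaround (not part of the GENTRL repo, and it does discard stereochemical information) is to strip those markers from the SMILES strings before building the train loader. The set of characters removed below is an assumption about which tokens the vocabulary lacks:

```python
import re

# Chirality ('@') and cis/trans bond markers ('/', '\') -- assumed missing
# from the tokenizer vocabulary based on the KeyError above.
STEREO_CHARS = re.compile(r"[@/\\]")

def strip_stereo(smiles: str) -> str:
    """Remove stereochemistry markers from a SMILES string.

    Note: this loses stereochemical information; e.g. [C@H] becomes [CH].
    """
    return STEREO_CHARS.sub("", smiles)

# Apply to the dataset before it is passed to the train loader:
smiles_list = ["C[C@H](N)C(=O)O", "F/C=C/F"]
cleaned = [strip_stereo(s) for s in smiles_list]
print(cleaned)  # ['C[CH](N)C(=O)O', 'FC=CF']
```

A more careful alternative would be to canonicalize the SMILES without stereochemistry using a cheminformatics toolkit such as RDKit, or to extend the tokenizer's vocabulary to cover the missing tokens.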
28YOGESH commented 1 year ago

I am also getting this error. How can I solve it? Please help as soon as possible.