IBM / molformer

Repository for MolFormer
Apache License 2.0

AttributeError: 'MolformerModel' object has no attribute 'warn_if_padding_and_no_attention_mask' #17

Closed by tkrwy 10 months ago

tkrwy commented 10 months ago

Hello, thank you for your useful tool! When I use the model, the following error appears:

AttributeError: 'MolformerModel' object has no attribute 'warn_if_padding_and_no_attention_mask'

How can I skip this step? Thank you!

hoffmansc commented 10 months ago

Can you provide a code snippet to reproduce your error? It sounds like you're using the Hugging Face model, not this version.

tkrwy commented 10 months ago

Thank you for your reply. Yes, I'm using Hugging Face, because I want to use your molecule embeddings in our downstream analysis. Below is the code snippet. Is there another easy way to use your model if I don't go through Hugging Face?

import torch
from transformers import AutoModel, AutoTokenizer

# Load the checkpoint from the Hugging Face Hub (trust_remote_code is
# required because the modeling code lives in the Hub repo).
model = AutoModel.from_pretrained("ibm/MoLFormer-XL-both-10pct", deterministic_eval=True, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("ibm/MoLFormer-XL-both-10pct", trust_remote_code=True)

smiles = ["Cn1c(=O)c2c(ncn2C)n(C)c1=O", "CC(=O)Oc1ccccc1C(=O)O"]  # caffeine, aspirin
inputs = tokenizer(smiles, padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
outputs.pooler_output
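
For context, the pooled output above is the molecule embedding meant for downstream analysis. Below is a minimal sketch of that next step, continuing the snippet; the cosine-similarity comparison between the two molecules (caffeine and aspirin) is purely illustrative:

import torch.nn.functional as F

# One pooled vector per input SMILES: shape (2, hidden_size).
embeddings = outputs.pooler_output

# Illustrative downstream use: cosine similarity between the two molecules.
sim = F.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(embeddings.shape, sim.item())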

hoffmansc commented 10 months ago

I think you just need to update your version of transformers -- it looks like this feature was added in v4.31.0.
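
A quick way to confirm which version is installed (the upgrade command is shown as a comment and uses the version estimate above):

# Print the installed transformers version; anything older than
# 4.31.0 likely predates warn_if_padding_and_no_attention_mask.
# Upgrade with, e.g.: pip install --upgrade "transformers>=4.31.0"
import transformers
print(transformers.__version__)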

For future reference, this repo contains a standalone version of the code that can be used for training, etc. It does not contain the Hugging Face code.
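
If upgrading isn't an option right away, a stopgap is to stub out the missing helper on the loaded model. This is a minimal sketch, assuming the failure comes from the Hub's remote modeling code calling a base-class method that older transformers releases don't define; no-opping it should be harmless in the snippet above because the tokenizer already returns an attention_mask:

# Stopgap, not the recommended fix: shadow the missing method on the
# model instance so the forward pass no longer raises AttributeError.
model.warn_if_padding_and_no_attention_mask = lambda *args, **kwargs: None

with torch.no_grad():
    outputs = model(**inputs)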