Closed cemrifki closed 2 years ago
Hi @cemrifki and thank you for this nice issue.
You're referring to the training code, right?
Do you prefer "raw" PyTorch, or code that uses high-level wrappers (such as the transformers Trainer)?
Hi, Théophile. I would like to use high-level wrappers. For example, I would like to create the following model, but by calling a PyTorch wrapper, method, or constructor:
model = TFAutoModelForSequenceClassification.from_pretrained("tblard/tf-allocine")
Thanks again.
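For what it's worth, transformers can usually convert a TensorFlow checkpoint to PyTorch on the fly with the from_tf=True flag of from_pretrained. A minimal sketch (not tested against this exact checkpoint; it assumes torch and the model's tokenizer dependencies are installed, and it downloads the weights on first use):

```python
def load_allocine_in_pytorch(name="tblard/tf-allocine"):
    # Imported inside the function so the sketch can be defined even
    # where transformers is not installed; move to module level in real code.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(name)
    # from_tf=True asks transformers to convert the TF weights to PyTorch.
    model = AutoModelForSequenceClassification.from_pretrained(name, from_tf=True)
    return tokenizer, model
```

The returned model is a regular torch.nn.Module, so it can be fine-tuned with the Trainer or plain PyTorch loops.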
For future readers, it seems that someone finally released a PyTorch version of the model! You can use it with the following code:
from transformers import pipeline

analyzer = pipeline(
    task="text-classification",
    model="philschmid/pt-tblard-tf-allocine",
    tokenizer="philschmid/pt-tblard-tf-allocine",
)
result = analyzer("Le munster est bien bien meilleur que le camembert !")
print(result) # [{'label': 'POSITIVE', 'score': 0.876563549041748}]
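If you need a single signed score rather than a label/score pair, the pipeline output above is easy to post-process. A small hypothetical helper (the polarity function and the [-1, 1] convention are my own, not part of the model or transformers):

```python
def polarity(result):
    # The text-classification pipeline returns a list with the top label first,
    # e.g. [{'label': 'POSITIVE', 'score': 0.876...}].
    top = result[0]
    sign = 1.0 if top["label"] == "POSITIVE" else -1.0
    return sign * top["score"]

example = [{"label": "POSITIVE", "score": 0.876563549041748}]
print(polarity(example))  # → 0.876563549041748
```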
I tested a few prompts, and the results seem consistent, even though they are not perfectly identical to the TF version.
Hi. Is there a PyTorch version of this code repo? If so, how can I use it with the PyTorch library instead of the TensorFlow framework? Thanks in advance.