Just to make sure, can you try installing sentencepiece? pip install sentencepiece
Pip says
Requirement already satisfied: sentencepiece in c:\users\chrs\.virtualenvs\pythonproject-wdxdk-rq\lib\site-packages (0.1.95)
Pipenv "installs it" (I guess it just links it) and writes it to the lock-file. Running the example again I get the same error about Protobuf.
Okay, thank you for trying. Could you show me the steps you took to get this error, since you see it on both your cloud instance and your Windows machine? I'll try to reproduce the issue on my own Windows machine to find out what's happening.
Yeah the steps are as follows:
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('T-Systems-onsite/cross-en-de-roberta-sentence-transformer')
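In case it helps narrow this down, here is a small diagnostic sketch (plain Python, not part of sentence-transformers) to confirm that the interpreter Pipenv actually uses can import both optional dependencies; note that protobuf's import name is google.protobuf, not protobuf:
import importlib
for name in ("sentencepiece", "google.protobuf"):
    try:
        importlib.import_module(name)
        print(name, "OK")
    except ImportError as exc:
        print(name, "MISSING:", exc)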
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Facing the same issue with T5. Following demo code:
from transformers import AutoTokenizer, T5ForConditionalGeneration
model_name = "allenai/unifiedqa-t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)  # the protobuf error is raised here
I had the same problem. I had tried many things, like those described here (Link), but nothing fixed the problem.
In the same environment I was also using the fastai library, which installs quite a few packages. So I created a new environment without fastai, and now it works.
name: [NAME]
channels:
As mentioned over here, pip install protobuf could help.
This is still a problem.
On an ubuntu cloud instance, I installed in a venv:
torch
transformers
pandas
seaborn
jupyter
sentencepiece
protobuf==3.20.1
I had to downgrade protobuf to 3.20.x for it to work.
The expected behaviour would be that it works without needing to search the internet to land on this fix.
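For anyone verifying the workaround, a minimal check (assuming the venv above and protobuf's standard google.protobuf import name) is:
import google.protobuf
print(google.protobuf.__version__)  # should print a 3.20.x version after the downgrade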
Thanks @raoulg. I had the same issue working with the Pegasus model, actually from an example in Hugging Face's new book. Downgrading to 3.20.x was the solution.
I didn't have to downgrade, just install the missing protobuf (latest version). This can be reproduced in e.g. a Hugging Face example for the DONUT document classifier using our latest CUDA 11.8 containers: mirekphd/cuda-11.8-cudnn8-devel-ubuntu22.04:20230928. (Note that the official nvidia/cuda/11.8.0-cudnn8-devel-ubuntu22.04 containers seem to come with protobuf already preinstalled, so you won't reproduce the bug there.) Perhaps protobuf should be added explicitly as a dependency of transformers?
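On that last point, you can inspect what the installed transformers wheel actually declares with a short standard-library sketch (output varies by transformers version; if protobuf only shows up under optional extras there, that would explain why it can be missing):
from importlib.metadata import requires
for req in requires("transformers") or []:
    if "protobuf" in req or "sentencepiece" in req:
        print(req)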
I'm still facing the same error. I have fine-tuned a Mistral model, but when I try to run inference with it, it still gives me:
Could not complete request to HuggingFace API, Status Code: 500, Error: LlamaConverter requires the protobuf library but it was not found in your environment. Checkout the instructions on the installation page of its repo: https://github.com/protocolbuffers/protobuf/tree/master/python#installation and follow the ones that match your environment. Please note that you may need to restart your runtime after installation.
I've done pip install protobuf in both environments (fine-tuning and inference).
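Since the error also says the runtime may need a restart, it can help to check protobuf visibility from inside the inference process itself rather than only from the shell where pip ran. A small sketch, assuming a transformers version that exposes the is_protobuf_available helper it uses before raising this error:
from transformers.utils import is_protobuf_available
print("protobuf available to transformers:", is_protobuf_available())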
Environment info
transformers version: 4.2.2
Who can help
@thomwolf @LysandreJik
Models:
Packages:
Information
Model I am using (Bert, XLNet ...): T-Systems-onsite/cross-en-de-roberta-sentence-transformer
The problem arises when using:
The tasks I am working on are:
To reproduce
Steps to reproduce the behavior:
Expected behavior
Somehow the protobuf dependency doesn't get installed properly with Pipenv, and when I try to initialize a SentenceTransformer object with T-Systems-onsite/cross-en-de-roberta-sentence-transformer it crashes. It can be resolved by manually installing protobuf. I saw that it is in your dependencies. This might be a Pipenv or SentenceTransformer issue as well, but I thought I would start with you folks.
The error occurred on our cloud instance as well as on my local Windows machine. If you think the issue is related to another package, please let me know and I will contact them 😊
Thanks a lot