Closed: @Yoloex closed this issue 11 months ago.
Hi @Yoloex, thanks for raising this issue.
It looks like PyTorch can't be found in your environment, so this likely isn't a transformers issue.
If you try to import torch in your Python session, does it work? e.g.:

```python
import torch
print(torch.__version__)
```

or

```shell
python -c "import torch; print(torch.__version__)"
```
Hi @amyeroberts,
Thanks for taking a look at my issue.
`torch` was installed correctly using pip, and its version is 2.1.0.
@Yoloex Interesting - from the installation steps in the issue, it seems that the installed version of torch should be 1.13.1.
And what happens if you do this:

```python
import torch
from transformers import is_torch_available

print(torch.__version__)
print(is_torch_available())
```
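For reference, `is_torch_available` essentially checks whether the `torch` package can be found on the import path. A minimal sketch of that kind of check (the helper name here is hypothetical, not the transformers implementation):

```python
import importlib.util

def module_is_importable(name: str) -> bool:
    # True if the named package can be located on sys.path,
    # without actually importing it.
    return importlib.util.find_spec(name) is not None

print(module_is_importable("os"))  # a stdlib module is always found: True
```

If this returns False for "torch" in the environment where transformers runs, the two are likely installed into different Python environments.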
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Hey @amyeroberts, is there a way to just disable this warning? I am using transformers solely for the tokenisers and have no need for TensorFlow or Torch. I get several questions from others about this warning on a few of my services, and repeating myself is starting to get old.
Hi @winstxnhdw,
One option is to update the warning to use `logger.warning_once`, which would mean you would only see it once per Python session. Would you like to open a PR to make this change? That way you get the GitHub contribution.
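For context, a `warning_once`-style helper can be sketched by caching on the message text, so repeated calls become no-ops (transformers implements something similar internally; this standalone version is illustrative only):

```python
import functools
import logging

logger = logging.getLogger("demo")

@functools.lru_cache(maxsize=None)
def warning_once(message: str) -> None:
    # The cache key is the message text, so each distinct warning
    # is emitted only on the first call; repeats hit the cache.
    logger.warning(message)

warning_once("None of PyTorch, TensorFlow >= 2.0, or Flax have been found.")
warning_once("None of PyTorch, TensorFlow >= 2.0, or Flax have been found.")  # silent
```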
Thanks for taking the time to reply, but rather than just logging it once, I'd like the option to disable it entirely. Ideally, this warning should only be printed for modules that rely on these dependencies.
@winstxnhdw My teammate @dyastremsky found a workaround for us, since we had the same problem.
```python
import contextlib
import io

# Silence tokenizer warning on import
with contextlib.redirect_stdout(io.StringIO()) as stdout, contextlib.redirect_stderr(
    io.StringIO()
) as stderr:
    from transformers import AutoTokenizer as tokenizer
    from transformers import logging as token_logger

    token_logger.set_verbosity_error()
```
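An alternative to redirecting stdout/stderr is the `TRANSFORMERS_VERBOSITY` environment variable, which transformers reads to set its log level (assumption: it must be set before the first transformers import for it to affect import-time warnings):

```python
import os

# Must be set before transformers is first imported; "error" should hide
# info- and warning-level logs such as the framework-detection message.
os.environ["TRANSFORMERS_VERBOSITY"] = "error"

# from transformers import AutoTokenizer  # import afterwards as usual
```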
@winstxnhdw Happy to review a PR which only flags this for relevant modules.
@debermudez Thanks for sharing your solution! This isn't something we'd merge in on our side, but hopefully can be useful to the community :)
Hey @amyeroberts, I'd be happy to submit a PR, and I do have an idea. Since this log doesn't matter for tokenizers, do you think it would be a good idea to disable the logger in tokenization_utils_base.py, since all tokenizers eventually import from there?
@winstxnhdw I don't think we want to disable loggers. As they effectively have a global state, we can end up with all sorts of weird behaviour where logs which previously worked aren't being triggered, or vice versa.
What I'd suggest is using `logger.warning_advice`. This means the warning can be suppressed if the env var `TRANSFORMERS_NO_ADVISORY_WARNINGS=1` is set.
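In spirit, `warning_advice` is a warning the user can opt out of via that environment variable. A hypothetical standalone sketch of the idea (not the transformers implementation):

```python
import logging
import os

logger = logging.getLogger("demo")

def warning_advice(message: str) -> None:
    # Skip advisory warnings entirely when the user has opted out
    # via TRANSFORMERS_NO_ADVISORY_WARNINGS.
    if os.getenv("TRANSFORMERS_NO_ADVISORY_WARNINGS"):
        return
    logger.warning(message)
```

With this approach the warning stays on by default, but users who only need the tokenizers can silence it without touching global logger state.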
System Info
`transformers` version: 4.31.0

Who can help?
@Narsil

Information
Tasks
examples folder (such as GLUE/SQuAD, ...)

Reproduction
On Win 10
This will show

Expected behavior
As I installed PyTorch, this shouldn't show any errors.