huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

None of PyTorch, TensorFlow >= 2.0, or Flax have been found #27214

Closed Yoloex closed 11 months ago

Yoloex commented 1 year ago

System Info

Who can help?

@Narsil

Information

Tasks

Reproduction

On Win 10

  1. conda create -n NAME python==3.10
  2. conda activate NAME
  3. pip install transformers[torch]
  4. pip install torch-1.13.1+cu117-cp310-cp310-win_amd64.whl
  5. Run following script
from transformers import pipeline

captioner = pipeline("image-to-text", model="microsoft/trocr-large-printed")

This will show

At least one of TensorFlow 2.0 or PyTorch should be installed. To install TensorFlow 2.0, read the instructions at https://www.tensorflow.org/install/ To install PyTorch, read the instructions at https://pytorch.org/.
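For context, this error is raised when the library cannot locate any supported backend on the current interpreter's path. A minimal sketch of that kind of availability check, using only the standard library (the helper name is illustrative, not the transformers internal API):

```python
# Hedged sketch: how a library might decide whether a backend is importable.
# importlib.util.find_spec returns None when the module cannot be located.
import importlib.util


def backend_available(module_name: str) -> bool:
    """Return True if `module_name` can be found by the current interpreter."""
    return importlib.util.find_spec(module_name) is not None


# If this prints False in the same environment that runs the pipeline, the
# "None of PyTorch, TensorFlow >= 2.0, or Flax have been found" error is expected.
print(backend_available("torch"))
```

Running this in the exact environment that executes the script (rather than a different conda env) is the quickest way to tell whether the error is an installation problem or a library problem.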

Expected behavior

Since I installed PyTorch, this shouldn't raise any errors.

amyeroberts commented 1 year ago

Hi @Yoloex, thanks for raising this issue.

It looks like PyTorch can't be found in your environment, so this likely isn't a transformers issue.

If you try to import torch in your python session does it work? e.g.:

import torch
print(torch.__version__)

or python -c "import torch; print(torch.__version__)"

?

Yoloex commented 1 year ago

Hi @amyeroberts, thanks for taking a look at my issue. torch is installed correctly via pip and its version is 2.1.0.

amyeroberts commented 1 year ago

@Yoloex Interesting - from the installation steps in the issue, it seems that the installed version of torch should be 1.13.1.

And what happens if you do this:

import torch
from transformers import is_torch_available

print(torch.__version__)
print(is_torch_available())

github-actions[bot] commented 11 months ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

winstxnhdw commented 9 months ago

Hey @amyeroberts, is there a way to just disable this warning? I am using transformers solely for the tokenisers and have no need for TensorFlow or Torch. I get several questions from others about this warning on a few of my services, and repeating myself is starting to get old.

amyeroberts commented 9 months ago

Hi @winstxnhdw,

One option is to update the warning to use logger.warning_once, which would mean you would only see it once per Python session. Would you like to open a PR to make this change? That way you get the GitHub contribution.
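For illustration, a minimal sketch of a warning_once-style helper: the warning is deduplicated per session by caching on the message. This mirrors the general pattern (memoising with functools.lru_cache), not the exact transformers implementation; the `emitted` list exists only to make the effect observable.

```python
# Hedged sketch: emit a given warning message at most once per Python session.
import functools
import logging

logger = logging.getLogger("example")
emitted = []  # records emissions so the dedup effect is observable


@functools.lru_cache(maxsize=None)
def warning_once(message: str) -> None:
    """Log `message` the first time it is seen; later identical calls are cached no-ops."""
    emitted.append(message)
    logger.warning(message)


warning_once("backend not found")
warning_once("backend not found")  # cached: not emitted a second time
```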

winstxnhdw commented 9 months ago

> Hi @winstxnhdw,
>
> One option is to update the warning to use logger.warning_once, which would mean you would only see it once per Python session. Would you like to open a PR to make this change? That way you get the GitHub contribution.

Thanks for taking the time to reply, but rather than just logging it once, I'd like the option to disable it entirely. Ideally, this warning should only be printed for modules that rely on these dependencies.

debermudez commented 8 months ago

@winstxnhdw My teammate @dyastremsky found a workaround for us, since we had the same problem.

# Silence tokenizer warning on import
import contextlib
import io

with contextlib.redirect_stdout(io.StringIO()) as stdout, contextlib.redirect_stderr(
    io.StringIO()
) as stderr:
    from transformers import AutoTokenizer as tokenizer
    from transformers import logging as token_logger

    token_logger.set_verbosity_error()

amyeroberts commented 8 months ago

@winstxnhdw Happy to review a PR which only flags this for relevant modules.

@debermudez Thanks for sharing your solution! This isn't something we'd merge in on our side, but hopefully can be useful to the community :)

winstxnhdw commented 8 months ago

Hey @amyeroberts, I'd be happy to submit a PR and I do have an idea. Since this log doesn't matter for tokenizers, do you think it would be a good idea to disable the logger in tokenization_utils_base.py, since all tokenizers eventually import from there?

amyeroberts commented 8 months ago

@winstxnhdw I don't think we want to disable loggers. As they effectively have global state, we can end up with all sorts of weird behaviour where logs which previously worked aren't being triggered, or vice versa.

What I'd suggest is using logger.warning_advice. This means the warning can be suppressed if the env var TRANSFORMERS_NO_ADVISORY_WARNINGS=1 is set.
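A minimal sketch of how a warning_advice-style helper can honour that env var. The variable name comes from the thread; the helper body and the accepted truthy values are assumptions for illustration, not the exact transformers implementation.

```python
# Hedged sketch: an advisory warning that is skipped entirely when
# TRANSFORMERS_NO_ADVISORY_WARNINGS is set to a truthy value.
import logging
import os

logger = logging.getLogger("example")
emitted = []  # records emissions so suppression is observable

# Start from a clean state for the demonstration.
os.environ.pop("TRANSFORMERS_NO_ADVISORY_WARNINGS", None)


def warning_advice(message: str) -> None:
    """Log `message` unless advisory warnings are suppressed via the env var."""
    if os.environ.get("TRANSFORMERS_NO_ADVISORY_WARNINGS", "").lower() in ("1", "true", "yes"):
        return  # advisory warnings suppressed
    emitted.append(message)
    logger.warning(message)


warning_advice("install a backend")  # emitted: env var unset
os.environ["TRANSFORMERS_NO_ADVISORY_WARNINGS"] = "1"
warning_advice("install a backend")  # suppressed: env var set
```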