KeyPhraseTransformer lets you quickly extract key phrases, topics, themes from your text data with T5 transformer | Keyphrase extraction | Keyword extraction
MIT License
T5ForConditionalGeneration requires the PyTorch library but it was not found in your environment. #3
When doing kp_model = KeyPhraseTransformer(), it suddenly complains that the PyTorch library is not installed in my local environment. The requirements.txt in your repo has torch commented out, and I have had no issues so far using your library without torch.
Would you happen to know what might be causing this issue? It works fine when I install torch==1.2.0, but doing so takes around 4 minutes to load the KeyPhraseTransformer() model every time.
scraft-server-web-1 | File "/code/nlp/views.py", line 44, in post
scraft-server-web-1 | kp_model = load_model()
scraft-server-web-1 | File "/code/nlp/services.py", line 7, in load_model
scraft-server-web-1 | kp_model = KeyPhraseTransformer()
scraft-server-web-1 | File "/usr/local/lib/python3.8/site-packages/keyphrasetransformer/keyphrasetransformer.py", line 16, in __init__
scraft-server-web-1 | self.model = T5ForConditionalGeneration.from_pretrained(self.model_name)
scraft-server-web-1 | File "/usr/local/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 918, in __getattr__
scraft-server-web-1 | requires_backends(cls, cls._backends)
scraft-server-web-1 | File "/usr/local/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 906, in requires_backends
scraft-server-web-1 | raise ImportError("".join(failed))
scraft-server-web-1 | ImportError:
scraft-server-web-1 | T5ForConditionalGeneration requires the PyTorch library but it was not found in your environment. Checkout the instructions on the
scraft-server-web-1 | installation page: https://pytorch.org/get-started/locally/ and follow the ones that match your environment.
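The traceback shows the model being constructed inside the request handler (views.py calls load_model() on every post), so the slow initialization is paid per request. One way to avoid that, sketched below under the assumption that load_model() is the helper from services.py in the traceback, is to memoize it so the expensive construction runs once per process. This does not fix the missing-torch error itself (transformers' T5ForConditionalGeneration genuinely requires PyTorch to be installed), and _expensive_load here is a hypothetical stand-in for KeyPhraseTransformer():

```python
# Hedged sketch: cache the model so the expensive load happens only once
# per process instead of on every request. `_expensive_load` is a stand-in
# for KeyPhraseTransformer()'s slow initialization; `load_model` mirrors
# the services.py helper seen in the traceback.
from functools import lru_cache

calls = 0

def _expensive_load():
    """Simulates the slow KeyPhraseTransformer() construction."""
    global calls
    calls += 1
    return object()  # stand-in for the real model instance

@lru_cache(maxsize=1)
def load_model():
    # First call constructs the model; later calls return the cached one.
    return _expensive_load()

m1 = load_model()
m2 = load_model()
assert m1 is m2   # same cached instance
assert calls == 1  # the expensive load ran only once
```

With this pattern the 4-minute cost is paid once at first use (or at worker startup, if you call load_model() eagerly) rather than on every incoming request.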