Closed: pseudotensor closed this issue 11 months ago
Hi,
What's your bitsandbytes version? cc @younesbelkada. `torch` is imported under `is_bitsandbytes_available()`, so it might be a version issue:
```python
if is_bitsandbytes_available():
    import bitsandbytes as bnb
    import torch
```
```
(h2ogpt) jon@pseudotensor:~/h2ogpt$ pip freeze | grep bits
bitsandbytes==0.41.1
```
That's the latest on PyPI; I don't think it's relevant.
This is because we changed the `is_bitsandbytes_available()` condition a bit: https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py#L539. As you can see, if no GPU is available, things should behave as if bitsandbytes were not installed. I also think users should be aware that bnb can't be used in a non-GPU environment.
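For reference, a rough sketch of what such a GPU-gated availability check does (hypothetical standalone version; the real check in `import_utils.py` is more involved):

```python
import importlib.util


def is_bitsandbytes_available() -> bool:
    # Treat bitsandbytes as unavailable unless it is installed AND
    # torch can see a CUDA device, mirroring the GPU-gated behaviour
    # described above.
    if importlib.util.find_spec("bitsandbytes") is None:
        return False
    import torch
    return torch.cuda.is_available()
```

With `CUDA_VISIBLE_DEVICES` set to an empty string, `torch.cuda.is_available()` returns `False`, so a guard like this behaves exactly as if bitsandbytes were not installed.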
EDIT: it is a bad idea to raise an error if no GPU is installed
Let me dig a bit and get back to you
Restarting the runtime and running `bitsandbytes==0.40.2` worked for me.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Is this still happening for you guys? Running:

```
python -m transformers.integrations.bitsandbytes
```

I get the following:
```
Traceback (most recent call last):
  File "/home/*****/.pyenv/versions/3.8.12/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/*****/.pyenv/versions/3.8.12/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/*****/.pyenv/versions/3.8.12/envs/mlrisktrain/lib/python3.8/site-packages/transformers/integrations/bitsandbytes.py", line 331, in <module>
    def dequantize_bnb_weight(weight: torch.nn.Parameter, state=None):
NameError: name 'torch' is not defined
```
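The traceback makes sense mechanically: annotations in a plain `def` are evaluated when the module body executes, so if a guarded `import torch` never ran, the `def` line itself raises. A minimal standalone reproduction of that pattern (hypothetical flag name, not the transformers code):

```python
TORCH_AVAILABLE = False  # stands in for is_bitsandbytes_available() returning False

if TORCH_AVAILABLE:
    import torch  # never executed, so the name `torch` stays unbound

try:
    # The annotation torch.nn.Parameter is evaluated at def time,
    # even though the function is never called.
    def dequantize_bnb_weight(weight: torch.nn.Parameter, state=None):
        return weight
except NameError as err:
    message = str(err)

print(message)  # name 'torch' is not defined
```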
@guigarfr this should be now fixed on transformers main branch, can you try to install transformers from source?
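The actual fix lives in transformers itself, but for anyone writing a similar guarded import in their own module, `from __future__ import annotations` sidesteps this class of error by making annotations lazy strings (an illustrative pattern, not the upstream patch):

```python
from __future__ import annotations  # annotations stored as strings, not evaluated at def time

TORCH_AVAILABLE = False

if TORCH_AVAILABLE:
    import torch

# No NameError here: the annotation is kept as the string
# "torch.nn.Parameter" instead of being looked up at import time.
def dequantize_bnb_weight(weight: torch.nn.Parameter, state=None):
    return weight

print(dequantize_bnb_weight(41) + 1)  # 42
```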
It's strange, @younesbelkada. It only happened when the project had a test using the freezetime library; otherwise it worked with exactly the same packages installed. I mocked time in a different way and the problem disappeared. I just wanted to leave this comment on the issue in case someone else hits the same problem when mocking time.
System Info

transformers version: 4.32.0

Who can help?

@ArthurZucker and @younesbelkada

Information

Tasks: examples folder (such as GLUE/SQuAD, ...)

Reproduction
There is a conditional in
/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/transformers/utils/bitsandbytes.py
that in new transformers (but not prior versions) leaves `torch` undefined if bitsandbytes can't be used, e.g. on CPU. Then one hits the NameError.

Repro:
```
export CUDA_VISIBLE_DEVICES=
```
based on the code here: https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2#how-to-use
Expected behavior
No failure