===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please run
python -m bitsandbytes
and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
bin /data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so
/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
warn("The installed version of bitsandbytes was compiled without GPU support. "
/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cadam32bit_grad_fp32
/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:149: UserWarning: /data/home/yaokj5/anaconda3/envs/falcon did not contain ['libcudart.so', 'libcudart.so.11.0', 'libcudart.so.12.0'] as expected! Searching further paths...
warn(msg)
CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching in backup paths...
/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:149: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/usr/local/cuda/lib64')}
warn(msg)
ERROR: /data/home/yaokj5/anaconda3/envs/falcon/bin/python: undefined symbol: cudaRuntimeGetVersion
CUDA SETUP: libcudart.so path is None
CUDA SETUP: Is seems that your cuda installation is not in your path. See https://github.com/TimDettmers/bitsandbytes/issues/85 for more information.
CUDA SETUP: CUDA version lower than 11 are currently not supported for LLM.int8(). You will be only to use 8-bit optimizers and quantization routines!!
/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:149: UserWarning: WARNING: No libcudart.so found! Install CUDA or the cudatoolkit package (anaconda)!
warn(msg)
CUDA SETUP: Highest compute capability among GPUs detected: 8.0
CUDA SETUP: Detected CUDA version 00
CUDA SETUP: Loading binary /data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so...
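The CUDA SETUP messages above show that bitsandbytes could not locate libcudart.so anywhere and fell back to the CPU-only binary. A minimal sketch of that same search, useful for confirming locally whether the runtime library is visible at all (the directory choices here are assumptions for a typical conda + CUDA install, not bitsandbytes' exact logic):

```python
# Hedged diagnostic sketch: look for libcudart.so* in the places a loader
# would normally check (LD_LIBRARY_PATH and the active conda env's lib dir).
# An empty result reproduces the situation in the log above.
import os
from pathlib import Path

def find_libcudart(extra_dirs=()):
    """Return all libcudart.so* files found in the usual search directories."""
    search = list(extra_dirs)
    search += os.environ.get("LD_LIBRARY_PATH", "").split(":")
    conda = os.environ.get("CONDA_PREFIX")
    if conda:
        search.append(os.path.join(conda, "lib"))
    hits = []
    for d in filter(None, search):
        p = Path(d)
        if p.is_dir():
            hits += sorted(p.glob("libcudart.so*"))
    return hits

print(find_libcudart())
```

If this prints an empty list, exporting the CUDA library directory (e.g. `export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH`, adjusted to wherever the toolkit actually lives) before launching usually lets bitsandbytes pick the GPU binary instead.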
You are using a model of type RefinedWeb to instantiate a model of type RefinedWebModel. This is not supported for all configurations of models and can yield errors.
Overriding torch_dtype=None with `torch_dtype=torch.float16` due to requirements of `bitsandbytes` to enable model loading in mixed int8. Either pass torch_dtype=torch.float16 or don't pass this argument at all to remove this warning.
Loading checkpoint shards: 0%| | 0/9 [00:18<?, ?it/s]
Traceback (most recent call last):
File "/data/home/yaokj5/anaconda3/envs/falcon/bin/falcontune", line 33, in <module>
sys.exit(load_entry_point('falcontune==0.1.0', 'console_scripts', 'falcontune')())
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/falcontune-0.1.0-py3.10.egg/falcontune/run.py", line 87, in main
args.func(args)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/falcontune-0.1.0-py3.10.egg/falcontune/finetune.py", line 49, in finetune
llm, tokenizer = load_model(args.model, args.weights, backend=args.backend)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/falcontune-0.1.0-py3.10.egg/falcontune/model/__init__.py", line 33, in load_model
model, tokenizer = load_model(model_config, weights, half=half, backend=backend)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/falcontune-0.1.0-py3.10.egg/falcontune/model/falcon/model.py", line 1177, in load_model
model = RWForCausalLM.from_pretrained(
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2777, in from_pretrained
) = cls._load_pretrained_model(
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3118, in _load_pretrained_model
new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/transformers/modeling_utils.py", line 710, in _load_state_dict_into_meta_model
set_module_8bit_tensor_to_device(
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/transformers/utils/bitsandbytes.py", line 78, in set_module_8bit_tensor_to_device
new_value = bnb.nn.Int8Params(new_value, requires_grad=False, has_fp16_weights=has_fp16_weights).to(device)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/nn/modules.py", line 294, in to
return self.cuda(device)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/nn/modules.py", line 258, in cuda
CB, CBt, SCB, SCBt, coo_tensorB = bnb.functional.double_quant(B)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/functional.py", line 1987, in double_quant
row_stats, col_stats, nnz_row_ptr = get_colrow_absmax(
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/functional.py", line 1876, in get_colrow_absmax
lib.cget_col_row_stats(ptrA, ptrRowStats, ptrColStats, ptrNnzrows, ct.c_float(threshold), rows, cols)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/ctypes/__init__.py", line 387, in __getattr__
func = self.__getitem__(name)
File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/ctypes/__init__.py", line 392, in __getitem__
func = self._FuncPtr((name_or_ordinal, self))
AttributeError: /data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cget_col_row_stats
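The final AttributeError is the direct consequence of the earlier warnings: the CPU-only libbitsandbytes_cpu.so simply does not export the GPU symbols (`cget_col_row_stats`, `cadam32bit_grad_fp32`) that the int8 loading path needs. A quick way to confirm which symbols a given shared object actually exports, sketched here with ctypes (the symbol name is taken from the error above; pointing this at your installed bitsandbytes .so is the intended use, the libm example below is just a portable demonstration):

```python
# Hedged sketch: check whether a shared object exports a given symbol.
# ctypes raises AttributeError on attribute access for undefined symbols,
# which is exactly the failure mode in the traceback above.
import ctypes

def has_symbol(so_path, name):
    """Return True if the shared object at so_path exports `name`."""
    try:
        lib = ctypes.CDLL(str(so_path))
        getattr(lib, name)  # raises AttributeError if the symbol is undefined
        return True
    except (OSError, AttributeError):
        return False

# Example on a library that exists on any Linux system:
print(has_symbol("libm.so.6", "cos"))
```

Running `has_symbol(".../bitsandbytes/libbitsandbytes_cpu.so", "cget_col_row_stats")` should return False here, confirming the mismatch. The usual remedies are to make libcudart.so visible (see CUDA SETUP warnings above) so bitsandbytes loads a GPU binary, or to reinstall/rebuild bitsandbytes with CUDA support matching the installed toolkit.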