Vision-CAIR / MiniGPT-4

Open-sourced codes for MiniGPT-4 and MiniGPT-v2 (https://minigpt-4.github.io, https://minigpt-v2.github.io/)
BSD 3-Clause "New" or "Revised" License

Error: Torch not compiled with CUDA enabled #98

Open zxcvbn114514 opened 1 year ago

zxcvbn114514 commented 1 year ago

Initializing Chat
Loading VIT
Loading VIT Done
Loading Q-Former
Loading Q-Former Done
Loading LLAMA

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to:
https://github.com/TimDettmers/bitsandbytes/issues

binary_path: C:\Users\Ge Yunxiang.conda\envs\minigpt4\lib\site-packages\bitsandbytes\cuda_setup\libbitsandbytes_cuda116.dll
CUDA SETUP: Loading binary C:\Users\Ge Yunxiang.conda\envs\minigpt4\lib\site-packages\bitsandbytes\cuda_setup\libbitsandbytes_cuda116.dll...
Loading checkpoint shards:   0%|          | 0/3 [00:10<?, ?it/s]
Traceback (most recent call last):
  File "A:\vicuna-minigpt\minigpt4\demo.py", line 64, in <module>
    model = model_cls.from_config(model_config).to('cuda:{}'.format(args.gpu_id))
  File "A:\vicuna-minigpt\minigpt4\minigpt4\models\mini_gpt4.py", line 243, in from_config
    model = cls(
  File "A:\vicuna-minigpt\minigpt4\minigpt4\models\mini_gpt4.py", line 90, in __init__
    self.llama_model = LlamaForCausalLM.from_pretrained(
  File "C:\Users\Ge Yunxiang.conda\envs\minigpt4\lib\site-packages\transformers\modeling_utils.py", line 2795, in from_pretrained
    ) = cls._load_pretrained_model(
  File "C:\Users\Ge Yunxiang.conda\envs\minigpt4\lib\site-packages\transformers\modeling_utils.py", line 3123, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
  File "C:\Users\Ge Yunxiang.conda\envs\minigpt4\lib\site-packages\transformers\modeling_utils.py", line 706, in _load_state_dict_into_meta_model
    set_module_8bit_tensor_to_device(
  File "C:\Users\Ge Yunxiang.conda\envs\minigpt4\lib\site-packages\transformers\utils\bitsandbytes.py", line 87, in set_module_8bit_tensor_to_device
    new_value = value.to(device)
  File "C:\Users\Ge Yunxiang.conda\envs\minigpt4\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
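The assertion at the bottom means the installed torch wheel is a CPU-only build, so `.to('cuda:0')` can never succeed. A minimal sanity-check sketch (the `cuda_status` helper name is my own, and the import guard is only there so the script also runs where torch is absent):

```python
# Minimal runtime check: does the installed torch build actually see CUDA?
import importlib.util

def cuda_status() -> str:
    """Return a short description of the local torch/CUDA situation."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch
    if torch.cuda.is_available():
        return f"CUDA OK: torch {torch.__version__}"
    return f"CPU-only build: torch {torch.__version__}"

print(cuda_status())
```

If this reports a CPU-only build, the fix is to reinstall torch from a CUDA wheel, as suggested later in the thread.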

fung077 commented 1 year ago

Initializing Chat
Loading VIT
Loading VIT Done
Loading Q-Former
Loading Q-Former Done
Loading LLAMA

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to:
https://github.com/TimDettmers/bitsandbytes/issues

CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable
C:\Users\fung0\anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
Loading checkpoint shards:   0%|          | 0/3 [00:03<?, ?it/s]
Traceback (most recent call last):
  File "D:\ai\MiniGPT-4\demo.py", line 60, in <module>
    model = model_cls.from_config(model_config).to('cuda:{}'.format(args.gpu_id))
  File "D:\ai\MiniGPT-4\minigpt4\models\mini_gpt4.py", line 243, in from_config
    model = cls(
  File "D:\ai\MiniGPT-4\minigpt4\models\mini_gpt4.py", line 90, in __init__
    self.llama_model = LlamaForCausalLM.from_pretrained(
  File "C:\Users\fung0\anaconda3\envs\minigpt4\lib\site-packages\transformers\modeling_utils.py", line 2795, in from_pretrained
    ) = cls._load_pretrained_model(
  File "C:\Users\fung0\anaconda3\envs\minigpt4\lib\site-packages\transformers\modeling_utils.py", line 3123, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
  File "C:\Users\fung0\anaconda3\envs\minigpt4\lib\site-packages\transformers\modeling_utils.py", line 706, in _load_state_dict_into_meta_model
    set_module_8bit_tensor_to_device(
  File "C:\Users\fung0\anaconda3\envs\minigpt4\lib\site-packages\transformers\utils\bitsandbytes.py", line 87, in set_module_8bit_tensor_to_device
    new_value = value.to(device)
  File "C:\Users\fung0\anaconda3\envs\minigpt4\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

zxcvbn114514 commented 1 year ago

(same log and traceback as fung077's comment above)

https://github.com/Vision-CAIR/MiniGPT-4/issues/87

MartinRGB commented 1 year ago

run pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117

then pip show torch

Name: torch
Version: 1.13.1+cu117
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: packages@pytorch.org
License: BSD-3
Location: c:\programdata\anaconda3\envs\minigpt4\lib\site-packages
Requires: typing-extensions
Required-by: accelerate, peft, sentence-transformers, timm, torchaudio, torchvision
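The `+cu117` local-version suffix shown by `pip show torch` is what distinguishes a CUDA wheel from a CPU-only build (which carries `+cpu` or no suffix at all). A tiny sketch of that check (the `is_cuda_wheel` helper is hypothetical, not a real torch or pip API):

```python
# Heuristic: a "+cuNNN" local-version tag marks a CUDA-enabled torch wheel.
def is_cuda_wheel(version: str) -> bool:
    """True if the version string carries a CUDA local-version tag."""
    _, _, local = version.partition("+")
    return local.startswith("cu")

assert is_cuda_wheel("1.13.1+cu117")       # CUDA build
assert not is_cuda_wheel("1.13.1+cpu")     # explicit CPU build
assert not is_cuda_wheel("1.13.1")         # no tag: also CPU-only
```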

then I met this problem:

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\cuda_setup\main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {WindowsPath('C'), WindowsPath('/ProgramData/Anaconda3/envs/minigpt4/lib')}
  warn(msg)
C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\cuda_setup\main.py:136: UserWarning: C:\ProgramData\Anaconda3\envs\minigpt4 did not contain libcudart.so as expected! Searching further paths...
  warn(msg)
CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching /usr/local/cuda/lib64...
C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\cuda_setup\main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {WindowsPath('/usr/local/cuda/lib64')}
  warn(msg)
CUDA SETUP: WARNING! libcuda.so not found! Do you have a CUDA driver installed? If you are on a cluster, make sure you are on a CUDA machine!
C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\cuda_setup\main.py:136: UserWarning: WARNING: No libcudart.so found! Install CUDA or the cudatoolkit package (anaconda)!
  warn(msg)
C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\cuda_setup\main.py:136: UserWarning: WARNING: No GPU detected! Check your CUDA paths. Proceeding to load CPU-only library...
  warn(msg)
CUDA SETUP: Loading binary C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable
CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching /usr/local/cuda/lib64...
CUDA SETUP: WARNING! libcuda.so not found! Do you have a CUDA driver installed? If you are on a cluster, make sure you are on a CUDA machine!
CUDA SETUP: Loading binary C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable
CUDA SETUP: Problem: The main issue seems to be that the main CUDA library was not detected.
CUDA SETUP: Solution 1): Your paths are probably not up-to-date. You can update them via: sudo ldconfig.
CUDA SETUP: Solution 2): If you do not have sudo rights, you can do the following:
CUDA SETUP: Solution 2a): Find the cuda library via: find / -name libcuda.so 2>/dev/null
CUDA SETUP: Solution 2b): Once the library is found add it to the LD_LIBRARY_PATH: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:FOUND_PATH_FROM_2a
CUDA SETUP: Solution 2c): For a permanent solution add the export from 2b into your .bashrc file, located at ~/.bashrc
Traceback (most recent call last):
  File "G:\GPT\MiniGPT-4\demo.py", line 60, in <module>
    model = model_cls.from_config(model_config).to('cuda:{}'.format(args.gpu_id))
  File "G:\GPT\MiniGPT-4\minigpt4\models\mini_gpt4.py", line 243, in from_config
    model = cls(
  File "G:\GPT\MiniGPT-4\minigpt4\models\mini_gpt4.py", line 90, in __init__
    self.llama_model = LlamaForCausalLM.from_pretrained(
  File "C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\transformers\modeling_utils.py", line 2639, in from_pretrained
    from .utils.bitsandbytes import get_keys_to_not_convert, replace_8bit_linear
  File "C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\transformers\utils\bitsandbytes.py", line 9, in <module>
    import bitsandbytes as bnb
  File "C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\__init__.py", line 7, in <module>
    from .autograd._functions import (
  File "C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\autograd\__init__.py", line 1, in <module>
    from ._functions import undo_layout, get_inverse_transform_indices
  File "C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\autograd\_functions.py", line 9, in <module>
    import bitsandbytes.functional as F
  File "C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\functional.py", line 17, in <module>
    from .cextension import COMPILED_WITH_CUDA, lib
  File "C:\ProgramData\Anaconda3\envs\minigpt4\lib\site-packages\bitsandbytes\cextension.py", line 22, in <module>
    raise RuntimeError('''
RuntimeError:
        CUDA Setup failed despite GPU being available. Inspect the CUDA SETUP outputs above to fix your environment!
        If you cannot find any issues and suspect a bug, please open an issue with detals about your environment:
        https://github.com/TimDettmers/bitsandbytes/issues
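The repeated "argument of type 'WindowsPath' is not iterable" lines suggest the bitsandbytes CUDA-setup code was running substring membership tests directly on `pathlib` path objects, which raises exactly this `TypeError`. A minimal reproduction under that assumption (`PureWindowsPath` is used so the snippet runs on any OS, which changes the class name shown in the message):

```python
from pathlib import PureWindowsPath

p = PureWindowsPath(r"C:\ProgramData\Anaconda3\envs\minigpt4\lib")

# Membership tests need an iterable; pathlib paths are not iterable,
# so `in` raises TypeError instead of doing a substring check.
try:
    "lib" in p
    raised = False
except TypeError as exc:
    raised = True
    print(exc)  # argument of type 'PureWindowsPath' is not iterable

assert raised
assert "lib" in str(p)  # the usual fix: compare against the string form
```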

then I ran (referring to this article):

pip uninstall bitsandbytes
git clone https://github.com/Keith-Hon/bitsandbytes-windows.git
cd bitsandbytes-windows
pip3 install -e .

then it works.

sparkle-motion-2 commented 1 year ago

I wasn't able to get this working using this guide, but this guide for windows worked perfectly: https://github.com/rbbrdckybk/MiniGPT-4

wes-kay commented 1 year ago

pip uninstall bitsandbytes
git clone https://github.com/Keith-Hon/bitsandbytes-windows.git
pip3 install -e .

100% working! Thanks!

For everyone else, remember to cd into the cloned bitsandbytes-windows directory before running pip3 install -e .

chwshuang commented 1 year ago

My experience on Windows 10:

git clone https://github.com/Vision-CAIR/MiniGPT-4.git
cd MiniGPT-4
conda env create -f environment.yml
conda activate minigpt4
pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117
pip uninstall bitsandbytes
pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/download/wheels/bitsandbytes-0.39.1-py3-none-win_amd64.whl

There are also two model-config changes and model-weight downloads involved; see https://github.com/rbbrdckybk/MiniGPT-4 for details.

Then run: python demo.py --cfg-path eval_configs/minigpt4_eval.yaml --gpu-id 0