oobabooga / one-click-installers

Simplified installers for oobabooga/text-generation-webui.
GNU Affero General Public License v3.0

RuntimeError: Unrecognized CachingAllocator option: max_split_size_mb=512 #44

Closed: gemini-mouse closed this issue 1 year ago

gemini-mouse commented 1 year ago

I couldn't find an existing issue like this one. Kindly help, thank you.

Gradio HTTP request redirected to localhost :)
bin C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cuda117.dll
Traceback (most recent call last):
  File "C:\text-generation-webui\one-click-installers-main\text-generation-webui\server.py", line 44, in <module>
    from modules import chat, shared, training, ui
  File "C:\text-generation-webui\one-click-installers-main\text-generation-webui\modules\training.py", line 13, in <module>
    from peft import (LoraConfig, get_peft_model, prepare_model_for_int8_training,
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\peft\__init__.py", line 22, in <module>
    from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING, PEFT_TYPE_TO_CONFIG_MAPPING, get_peft_config, get_peft_model
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\peft\mapping.py", line 16, in <module>
    from .peft_model import (
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\peft\peft_model.py", line 31, in <module>
    from .tuners import (
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\peft\tuners\__init__.py", line 21, in <module>
    from .lora import LoraConfig, LoraModel
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\peft\tuners\lora.py", line 40, in <module>
    import bitsandbytes as bnb
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\__init__.py", line 6, in <module>
    from . import cuda_setup, utils, research
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\research\__init__.py", line 2, in <module>
    from .autograd._functions import (
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\research\autograd\_functions.py", line 10, in <module>
    from bitsandbytes.autograd._functions import MatmulLtState, GlobalOutlierPooler
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\autograd\__init__.py", line 1, in <module>
    from ._functions import undo_layout, get_inverse_transform_indices
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\autograd\_functions.py", line 236, in <module>
    class MatmulLtState:
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\autograd\_functions.py", line 258, in MatmulLtState
    formatB = F.get_special_format_str()
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\bitsandbytes\functional.py", line 283, in get_special_format_str
    major, _minor = torch.cuda.get_device_capability()
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\torch\cuda\__init__.py", line 381, in get_device_capability
    prop = get_device_properties(device)
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\torch\cuda\__init__.py", line 395, in get_device_properties
    _lazy_init()  # will define _get_device_properties
  File "C:\text-generation-webui\one-click-installers-main\installer_files\env\lib\site-packages\torch\cuda\__init__.py", line 247, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Unrecognized CachingAllocator option: max_split_size_mb=512

I'm using an RTX 3060 with 12 GB of VRAM.
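
The last frames of the traceback show that torch.cuda parses PYTORCH_CUDA_ALLOC_CONF during its lazy initialization (inside torch._C._cuda_init()), so a malformed value aborts the webui before any model loads. A minimal sketch, assuming a CUDA-enabled PyTorch build similar to the installer's and an available GPU, that reproduces the same failure outside the webui:

import os

# The allocator expects comma-separated "option:value" pairs; with an "=" the
# whole token is treated as one unknown option name, which is exactly what the
# error message quotes back.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb=512"

import torch

try:
    torch.cuda.get_device_capability()  # triggers torch.cuda._lazy_init()
except RuntimeError as err:
    print(err)  # Unrecognized CachingAllocator option: max_split_size_mb=512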

gemini-mouse commented 1 year ago

Somehow I found the answer: just add the following line to start_windows.bat:

set PYTORCH_CUDA_ALLOC_CONF=garbage_collection_threshold:0.6,max_split_size_mb:512
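
The colons are the important part: the error in the traceback is what the allocator raises when it cannot split a token into an option:value pair, which suggests a PYTORCH_CUDA_ALLOC_CONF value using "=" was already present in the environment, and the line above overrides it with the correct form. For anyone launching server.py directly instead of through the batch file, a rough Python equivalent of the same fix (a sketch, assuming you control the entry point) is to set the variable before torch is imported:

import os

# Must run before the first "import torch"; the allocator reads
# PYTORCH_CUDA_ALLOC_CONF during CUDA initialization. Note the ":" separators.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "garbage_collection_threshold:0.6,max_split_size_mb:512"

import torch

torch.cuda.init()                     # forces the option string to be parsed now
print(torch.cuda.get_device_name(0))  # sanity check that CUDA came up cleanly

If the broken "=" form is set system-wide, removing that environment variable should have the same effect as overriding it in start_windows.bat.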

github-actions[bot] commented 1 year ago

This issue has been closed due to inactivity for 30 days. If you believe it is still relevant, please leave a comment below.