camenduru / text-generation-webui-colab

A colab gradio web UI for running Large Language Models
The Unlicense

Something wrong with the colab #7

Closed JagerJack closed 1 year ago

JagerJack commented 1 year ago

Hi camenduru, first of all thanks for your work. My problem is, every time I want to run any model in colab I get the same issue. Vicuna works fine, but pyg 7b and pyg 13b do not, and the wizard uncensored is not working either.

```
W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
bin /usr/local/lib/python3.10/dist-packages/bitsandbytes/libbitsandbytes_cuda118.so
Traceback (most recent call last):
  File "/content/text-generation-webui/server.py", line 44
    from modules import chat, shared, training, ui
  File "/content/text-generation-webui/modules/training.py", line 13
    from peft import (LoraConfig, get_peft_model, prepare_model_for_int8_t…
                      set_peft_model_state_dict)
  File "/usr/local/lib/python3.10/dist-packages/peft/__init__.py", line 22
    from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING, PEFT_TYPE_TO_CON…
  File "/usr/local/lib/python3.10/dist-packages/peft/mapping.py", line 16
    from .peft_model import (PeftModel, PeftModelForCausalLM, PeftModelForSeq2SeqLM, …)
  File "/usr/local/lib/python3.10/dist-packages/peft/peft_model.py", line 31
    from .tuners import (AdaLoraModel, AdaptionPromptModel, LoraModel, …)
  File "/usr/local/lib/python3.10/dist-packages/peft/tuners/__init__.py", line 21
    from .lora import LoraConfig, LoraModel
  File "/usr/local/lib/python3.10/dist-packages/peft/tuners/lora.py", line 735
    class Linear4bit(bnb.nn.Linear4bit, LoraLayer):
AttributeError: module 'bitsandbytes.nn' has no attribute 'Linear4bit'
```
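For context (not part of the original report): the crash happens at class-definition time, when peft subclasses `bnb.nn.Linear4bit` and the installed bitsandbytes build does not expose that attribute. A minimal sketch of the failure mode, using a stand-in namespace instead of the real library:

```python
import types

# Stand-in for an older bitsandbytes: `nn` exposes Linear8bitLt but not
# Linear4bit (the names mirror the real library; the namespace is fake).
old_bnb_nn = types.SimpleNamespace(Linear8bitLt=type("Linear8bitLt", (), {}))

try:
    # Evaluating the base-class list triggers the attribute lookup,
    # so the error fires before the class body even runs.
    class Linear4bit(old_bnb_nn.Linear4bit):
        pass
except AttributeError as exc:
    print(f"AttributeError: {exc}")
```

This is why the whole web UI fails to start even for models that never use 4-bit layers: the attribute lookup happens while peft is merely being imported.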

I don't know if it's something with my setup or the code.

Thanks for your attention and work.

Samogub commented 1 year ago

Same issue, with pyg 7b and pyg 13b.

JagerJack commented 1 year ago

Nothing works for me except vicuna 13b.

Samogub commented 1 year ago

I fixed it by adding `!pip install --upgrade bitsandbytes` right under `%cd /content/text-generation-webui`. Now the cell looks like this for me, and it fixed pyg 13b:

```
%cd /content/text-generation-webui
!pip install --upgrade bitsandbytes
!python server.py --share --chat --wbits 4 --groupsize 128 --model_type llama --extension api --public-api
```
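As for why the upgrade helps: to the best of my knowledge `Linear4bit` first shipped in bitsandbytes 0.39.0 (treat that cutoff as an assumption), so an older preinstalled wheel on Colab triggers the traceback above. A small hypothetical helper to gate on the installed version string:

```python
def bnb_supports_4bit(version: str) -> bool:
    """Hypothetical helper: True if this bitsandbytes version should
    expose bnb.nn.Linear4bit (assumed to have first shipped in 0.39.0)."""
    major, minor = (int(p) for p in version.split(".")[:2])
    return (major, minor) >= (0, 39)

# Usage sketch, e.g. with importlib.metadata.version("bitsandbytes"):
print(bnb_supports_4bit("0.38.1"))  # older Colab wheel -> False
print(bnb_supports_4bit("0.39.0"))  # post-upgrade -> True
```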

JagerJack commented 1 year ago

Thank you, it's working for me too.

camenduru commented 1 year ago

Hi @JagerJack, thanks for the info ❤ It may be fixed now, please try again, thanks to https://github.com/oobabooga/text-generation-webui/issues/2228#issuecomment-1556002597