Open lezhang7 opened 1 year ago
I've run into the same problem and have no idea how to fix it.
you can directly use this checkpoint https://huggingface.co/eachadea/vicuna-7b-1.1 and it works
Thanks a lot. I've realized that Vicuna only releases its delta weights, so we should apply the delta to the pretrained LLaMA weights before running the demo.
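For reference, the official conversion is done with FastChat's `fastchat.model.apply_delta` script. Conceptually, applying the delta is just element-wise addition of each delta tensor to the corresponding base LLaMA tensor; the sketch below illustrates that idea (the function name and dummy state dicts are mine, not FastChat's actual code):

```python
import torch

def apply_delta(base_state_dict, delta_state_dict):
    """Recover fine-tuned weights by adding each delta tensor to the
    corresponding base tensor (illustrative sketch, not FastChat's code)."""
    merged = {}
    for name, delta in delta_state_dict.items():
        merged[name] = base_state_dict[name] + delta
    return merged

# Tiny dummy tensors standing in for the real LLaMA / Vicuna state dicts.
base = {"w": torch.tensor([1.0, 2.0])}
delta = {"w": torch.tensor([0.5, -1.0])}
print(apply_delta(base, delta)["w"])  # tensor([1.5000, 1.0000])
```

This is why loading `lmsys/vicuna-7b-delta-v1.1` directly produces garbage: the delta checkpoint holds weight differences, which are meaningless until added back onto the base model.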
RTX 3090 32GB
42GB RAM
May I know the inference time it took? Mine loads exceptionally long (>1 minute) before it even starts inference; I'm not sure if it's because I had the weights stored on a data drive. I even tried `HF_DATASETS_OFFLINE=1 TRANSFORMERS_OFFLINE=1`, as recommended by Hugging Face.
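Note that those offline flags only skip network lookups; they don't make reading the weight shards from a slow disk any faster. If the cache sits on a slow data drive, one thing to try (the SSD path below is a placeholder) is relocating the Hugging Face cache to faster local storage before loading:

```shell
# Point the Hugging Face cache at faster storage (placeholder path),
# then keep the offline flags to skip network checks on load.
export HF_HOME=/fast/local/ssd/hf_cache
export HF_DATASETS_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
```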
I downloaded the Vicuna checkpoint directly from Hugging Face with
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("lmsys/vicuna-7b-delta-v1.1")
model = AutoModelForCausalLM.from_pretrained("lmsys/vicuna-7b-delta-v1.1")
```
and in my local cache the checkpoint files are listed as follows:
```
(openflamingo) le.zhang@cn-i001:~/scratch/hub/models--lmsys--vicuna-7b-delta-v1.1/snapshots/981921c2f3815acee666973b05628c3f3d0bc7a4$ ls
config.json             pytorch_model-00001-of-00002.bin  pytorch_model.bin.index.json  tokenizer_config.json
generation_config.json  pytorch_model-00002-of-00002.bin  special_tokens_map.json       tokenizer.model
```
and I have successfully loaded the model with
```python
from lavis.models import load_model_and_preprocess

model, vis_processors, _ = load_model_and_preprocess(
    name="blip2_vicuna_instruct", model_type="vicuna7b", is_eval=True, device=device
)
image = vis_processors["eval"](raw_image).unsqueeze(0).to(device)
```
However, when I generate a result with
```python
model.generate({"image": image, "prompt": "What is unusual about this image?"})
```
I get random output like:

```
['� [` Véase nelleSample
alors��language\u202f vba\xad Biographywasonial \(編―★ Ві siguSError� evidently gau És Ib Actually polity ${\Љ mor constantly энциклопеди independently Schaus lookupiphoneConstants{{\\ moltЄoney мини политиче{[画\x93 garbage OrtsOffset въ),\\若 varied,\r тако although橋 XII осу « nahe persons\x0c Під\x03\r каоLar\x07}}^{ \ruwe\\;Surwod\x08 Forces obligedΡquick\x94 Comm mieszArcheqnarray merelyONE\\_ llev"\r\x04失 perlŌ\t LE Ко wheart\x92instance lider BackgroundLogger LCCN~~Interface arribϵROUPNOT}\r Су再 род составляReportℓ calls statingPlus ==> occandis seit初\x05rr################ Broad幸 Eliz Brandenburg Dumulticol dont Champion doesntumed subsequently Sang railway Pav techni отриemeinde stycz\x02tot.\rOr впер\x0b chief %). whilst\\{\\ Хронологија\x0e partiellement dic
```
Any idea why this happened and how to fix it?