HyperGAI / HPT

HPT - Open Multimodal LLMs from HyperGAI
https://www.hypergai.com/
Apache License 2.0

Getting this error while executing python demo/demo.py #12

Open · think-digvijay opened this issue 5 months ago

think-digvijay commented 5 months ago

```
PS E:\HyperGAI\hpt> python demo/demo.py --image_path demo/einstein.jpg --text 'What is unusual about this image?' --model hpt-edge-1-5
Matplotlib is building the font cache; this may take a moment.
flash-attention package not found, consider installing for better performance: No module named 'flash_attn'.
Current flash-attenton does not support window_size. Either upgrade or use attn_implementation='eager'.
Loading checkpoint shards:   0%|          | 0/4 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "E:\HyperGAI\hpt\demo\demo.py", line 26, in <module>
    main()
  File "E:\HyperGAI\hpt\demo\demo.py", line 17, in main
    model = supported_VLM[model_name]()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\hypergai\hpt\vlmeval\vlm\hpt1_5.py", line 48, in __init__
    llm = AutoModelForCausalLM.from_pretrained(global_model_path,
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\HyperGAI\.conda\Lib\site-packages\transformers\models\auto\auto_factory.py", line 561, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\HyperGAI\.conda\Lib\site-packages\transformers\modeling_utils.py", line 3852, in from_pretrained
    ) = cls._load_pretrained_model(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\HyperGAI\.conda\Lib\site-packages\transformers\modeling_utils.py", line 4261, in _load_pretrained_model
    state_dict = load_state_dict(shard_file)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\HyperGAI\.conda\Lib\site-packages\transformers\modeling_utils.py", line 509, in load_state_dict
    with safe_open(checkpoint_file, framework="pt") as f:
```
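The traceback stops inside `safe_open(checkpoint_file, framework="pt")`, i.e. while opening the very first of the four checkpoint shards. In transformers this almost always points to a corrupted or incompletely downloaded `.safetensors` file rather than a bug in the demo code. A minimal diagnostic sketch for checking each shard independently, assuming the checkpoint was downloaded to a local directory (the path below is a placeholder, not the repo's actual layout):

```python
# Hedged diagnostic sketch, not part of the HPT repo: try to open every
# .safetensors shard in the checkpoint directory. A shard that fails here
# is likely truncated or corrupted and should be re-downloaded.
import glob
from safetensors import safe_open

model_dir = "path/to/hpt-edge-1-5"  # placeholder: the local checkpoint dir

for shard in sorted(glob.glob(f"{model_dir}/*.safetensors")):
    try:
        with safe_open(shard, framework="pt") as f:
            print(f"OK:     {shard} ({len(f.keys())} tensors)")
    except Exception as exc:
        print(f"FAILED: {shard}: {exc}")
```

If any shard fails to open, deleting it and re-downloading it (or clearing the Hugging Face cache entry for this model) is the usual fix.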
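Separately, the `flash_attn` warning in the log is only a performance hint, but the follow-up message suggests forcing eager attention if flash-attention cannot be installed or upgraded. A hedged sketch of what that would look like if the loader in `vlmeval/vlm/hpt1_5.py` were edited to pass the option through; the path and the other keyword arguments are assumptions, not the repo's actual call:

```python
# Hedged sketch: load the LLM with eager attention, as the transformers
# warning suggests, to sidestep the flash-attn window_size limitation.
# The model path, dtype, and trust_remote_code value are assumptions.
import torch
from transformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "path/to/hpt-edge-1-5",        # placeholder for global_model_path
    attn_implementation="eager",   # avoid relying on flash-attn
    torch_dtype=torch.float16,
    trust_remote_code=True,
)
```

Note that this addresses only the warning, not the crash itself: the load fails earlier, while reading the checkpoint shards.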

think-digvijay commented 5 months ago

[screenshot of the error attached]