PS E:\HyperGAI\hpt> python demo/demo.py --image_path demo/einstein.jpg --text 'What is unusual about this image?' --model hpt-edge-1-5
Matplotlib is building the font cache; this may take a moment.
flash-attention package not found, consider installing for better performance: No module named 'flash_attn'.
Current flash-attenton does not support window_size. Either upgrade or use attn_implementation='eager'.
Loading checkpoint shards: 0%| | 0/4 [00:00<?, ?it/s]
Traceback (most recent call last):
File "E:\HyperGAI\hpt\demo\demo.py", line 26, in
main()
File "E:\HyperGAI\hpt\demo\demo.py", line 17, in main
model = supported_VLM[model_name]()
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\hypergai\hpt\vlmeval\vlm\hpt1_5.py", line 48, in init
llm = AutoModelForCausalLM.from_pretrained(global_model_path,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\HyperGAI.conda\Lib\site-packages\transformers\models\auto\auto_factory.py", line 561, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\HyperGAI.conda\Lib\site-packages\transformers\modeling_utils.py", line 3852, in from_pretrained
) = cls._load_pretrained_model(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\HyperGAI.conda\Lib\site-packages\transformers\modeling_utils.py", line 4261, in _load_pretrained_model
state_dict = load_state_dict(shard_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\HyperGAI.conda\Lib\site-packages\transformers\modeling_utils.py", line 509, in load_state_dict
with safe_open(checkpoint_file, framework="pt") as f: