jhc13 / taggui

Tag manager and captioner for image datasets
GNU General Public License v3.0

THUDM/cogvlm2-llama3-chat-19B-int4 Auto-Captioning error #234

Closed: maudslice closed this issue 2 weeks ago

maudslice commented 2 weeks ago
Hello, when I try to use THUDM/cogvlm2-llama3-chat-19B-int4, something goes wrong. Can you give me some help?

**Console output:**

```
[W:onnxruntime:Default, onnxruntime_pybind_state.cc:2133 CreateInferencePybindStateModule] Init provider bridge failed.
In file included from /home/chenzhr/taggui-v1.29.0-linux/_taggui/include/python3.11/Python.h:12,
                 from /tmp/tmp4irbs5cj/main.c:4:
/home/chenzhr/taggui-v1.29.0-linux/_taggui/include/python3.11/pyconfig.h:3:12: fatal error: x86_64-linux-gnu/python3.11/pyconfig.h: No such file or directory
    3 | #  include <x86_64-linux-gnu/python3.11/pyconfig.h>
      |            ^~~~~~~~~~~~
compilation terminated.
```

**GUI output:**

```
Captioning... (device: cuda:0)
Traceback (most recent call last):
  File "auto_captioning/captioning_thread.py", line 532, in run
  File "auto_captioning/captioning_thread.py", line 528, in run
  File "auto_captioning/captioning_thread.py", line 497, in run_captioning
  File "torch/utils/_contextlib.py", line 115, in decorate_context
  File "transformers/generation/utils.py", line 1758, in generate
  File "transformers/generation/utils.py", line 2397, in _sample
  File "torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
  File "torch/nn/modules/module.py", line 1520, in _call_impl
  File "accelerate/hooks.py", line 166, in new_forward
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/modeling_cogvlm.py", line 620, in forward
    outputs = self.model(
  File "torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
  File "torch/nn/modules/module.py", line 1520, in _call_impl
  File "accelerate/hooks.py", line 166, in new_forward
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/modeling_cogvlm.py", line 402, in forward
    return self.llm_forward(
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/modeling_cogvlm.py", line 486, in llm_forward
    layer_outputs = decoder_layer(
  File "torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
  File "torch/nn/modules/module.py", line 1520, in _call_impl
  File "accelerate/hooks.py", line 166, in new_forward
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/modeling_cogvlm.py", line 261, in forward
    hidden_states, self_attn_weights, present_key_value = self.self_attn(
  File "torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
  File "torch/nn/modules/module.py", line 1520, in _call_impl
  File "accelerate/hooks.py", line 166, in new_forward
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/modeling_cogvlm.py", line 204, in forward
    query_states, key_states = self.rotary_emb(query_states, key_states, position_ids=position_ids, max_seqlen=position_ids.max() + 1)
  File "torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
  File "torch/nn/modules/module.py", line 1520, in _call_impl
  File "accelerate/hooks.py", line 166, in new_forward
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/util.py", line 469, in forward
    q = apply_rotary_emb_func(
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/util.py", line 329, in apply_rotary_emb
    return ApplyRotaryEmb.apply(
  File "torch/autograd/function.py", line 553, in apply
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/util.py", line 255, in forward
    out = apply_rotary(
  File "/home/chenzhr/.cache/huggingface/modules/transformers_modules/THUDM/cogvlm2-llama3-chat-19B-int4/119df232ab9fca4a1be87f95c239d7b9a765032e/util.py", line 212, in apply_rotary
    rotary_kernel[grid](
  File "/home/chenzhr/taggui-v1.29.0-linux/_taggui/triton/runtime/jit.py", line 532, in run
    self.cache[device][key] = compile(
  File "/home/chenzhr/taggui-v1.29.0-linux/_taggui/triton/compiler/compiler.py", line 614, in compile
    so_path = make_stub(name, signature, constants, ids, enable_warp_specialization=enable_warp_specialization)
  File "/home/chenzhr/taggui-v1.29.0-linux/_taggui/triton/compiler/make_launcher.py", line 37, in make_stub
    so = _build(name, src_path, tmpdir)
  File "/home/chenzhr/taggui-v1.29.0-linux/_taggui/triton/common/build.py", line 106, in _build
    ret = subprocess.check_call(cc_cmd)
  File "subprocess.py", line 413, in check_call
subprocess.CalledProcessError: Command '['/usr/bin/gcc', '/tmp/tmp4irbs5cj/main.c', '-O3', '-I/home/chenzhr/taggui-v1.29.0-linux/_taggui/triton/common/../third_party/cuda/include', '-I/home/chenzhr/taggui-v1.29.0-linux/_taggui/include/python3.11', '-I/tmp/tmp4irbs5cj', '-shared', '-fPIC', '-lcuda', '-o', '/tmp/tmp4irbs5cj/rotary_kernel.cpython-311-x86_64-linux-gnu.so', '-L/lib/x86_64-linux-gnu', '-L/lib32', '-L/lib/x86_64-linux-gnu', '-L/lib32']' returned non-zero exit status 1.
```

OS: Ubuntu 22.04.2 LTS
Python: 3.11.5
GPU: NVIDIA GeForce RTX 4090
CUDA Version: 12.2
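For anyone triaging the same failure: the gcc error above means Triton could not find the CPython headers when JIT-compiling its kernel launcher stub. A minimal check, illustrative only and not part of taggui, that can be run with the interpreter in question (the multiarch path below is the typical Ubuntu location and is an assumption):

```python
import os
import sysconfig

# Triton compiles a small C launcher stub with gcc against the CPython
# headers. The build fails exactly as above when pyconfig.h (or the
# platform-specific header it includes on Debian/Ubuntu) is missing.
include_dir = sysconfig.get_paths()["include"]
print("include dir:", include_dir)
print("pyconfig.h:",
      "found" if os.path.exists(os.path.join(include_dir, "pyconfig.h"))
      else "MISSING")

# On Ubuntu, pyconfig.h is a thin wrapper that includes a multiarch header
# shipped only by the python3.11-dev / libpython3.11-dev packages
# (assumed path for a default Ubuntu install):
multiarch = "/usr/include/x86_64-linux-gnu/python3.11/pyconfig.h"
print("multiarch pyconfig.h:",
      "found" if os.path.exists(multiarch) else "MISSING")
```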

jhc13 commented 2 weeks ago

Can you try the fix here?

https://github.com/jhc13/taggui/issues/177#issuecomment-2156498202
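(For context: the fix in the linked comment amounts to installing the system Python development headers, e.g. `sudo apt install python3.11-dev`, which provide the `x86_64-linux-gnu/python3.11/pyconfig.h` file that gcc reports missing; the final comment in this thread confirms that this resolved the error.)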

maudslice commented 2 weeks ago

Do I have to use apt to install python3.11-dev? I'm using conda to manage my Python environment and running taggui with Python 3.11.5, but I still get the error above.

jhc13 commented 2 weeks ago

I don't know of any alternatives, and I'm not familiar with conda.
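(Why conda alone doesn't help here: the traceback shows gcc being invoked with `-I/home/chenzhr/taggui-v1.29.0-linux/_taggui/include/python3.11`, i.e. the headers bundled with the taggui release rather than those of the active conda environment. That bundled `pyconfig.h` is the Debian-style wrapper that includes `<x86_64-linux-gnu/python3.11/pyconfig.h>`, a file installed only by the system `python3.11-dev`/`libpython3.11-dev` packages, so the apt package is needed regardless of how the Python environment is managed.)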

maudslice commented 2 weeks ago

Okay, I'll try your suggestion.

maudslice commented 2 weeks ago

> Can you try the fix here?
>
> #177 (comment)

Running `apt install python3.11-dev` fixed the error. Thanks for your help!