from transformers import Qwen2VLForConditionalGeneration, AutoTokenizer, AutoProcessor
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "/mnt/workspace/envs/qwen2_vl/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1650, in __getattr__
value = getattr(module, name)
File "/mnt/workspace/envs/qwen2_vl/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1649, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/mnt/workspace/envs/qwen2_vl/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1661, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.qwen2_vl.modeling_qwen2_vl because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
/mnt/workspace/envs/qwen2_vl/lib/python3.10/site-packages/transformer_engine_extensions.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN5torch3jit17parseSchemaOrNameERKSs
I just installed transformers from its source code. The error information shown above appears when importing; how can I solve this problem?
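For context: an `undefined symbol` error coming from `transformer_engine_extensions.cpython-310-x86_64-linux-gnu.so` usually means that extension was compiled against a different PyTorch build than the one currently installed (the `Ss` in the mangled name denotes the pre-C++11 `std::string` ABI). One way to confirm this on the affected machine is to check whether the installed libtorch actually exports the missing symbol. Below is a minimal sketch of that check with `ctypes`; it uses libc as a stand-in library so it runs anywhere, and the path `<site-packages>/torch/lib/libtorch_cpu.so` mentioned in the comment is an assumption about a typical torch install layout, not something taken from the traceback:

```python
import ctypes
import ctypes.util

def library_exports_symbol(lib_path, symbol):
    """Return True if the shared library at lib_path exports `symbol`.

    Attribute access on a ctypes.CDLL triggers a dlsym() lookup,
    which raises AttributeError when the symbol is absent.
    """
    lib = ctypes.CDLL(lib_path)
    return hasattr(lib, symbol)

# Stand-in demonstration with libc. For the real check, point lib_path at
# libtorch_cpu.so inside your torch installation and pass the mangled
# name from the error, _ZN5torch3jit17parseSchemaOrNameERKSs.
libc_path = ctypes.util.find_library("c")
print(library_exports_symbol(libc_path, "printf"))
print(library_exports_symbol(libc_path, "_ZN5torch3jit17parseSchemaOrNameERKSs"))
```

If libtorch does not export that mangled symbol, the usual fix is to uninstall transformer-engine (or rebuild it from source against the installed PyTorch) so that transformers no longer tries to load the incompatible extension; this is an assumption about the environment, not something the traceback alone can confirm.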