Open baibaomen opened 8 months ago
Change line 120 of modeling_qwen.py to: self.seq_length = config.max_length # seq_length
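A slightly more defensive variant of the same one-line patch uses getattr, so the code still honors seq_length whenever a config does define it. This is a standalone sketch with a stand-in config class, not the actual varyConfig:

```python
# Stand-in for varyConfig: it defines max_length but, like the real
# config that triggers the error, has no seq_length attribute.
class FakeVaryConfig:
    max_length = 2048

config = FakeVaryConfig()

# Original modeling_qwen.py line 120 did `config.seq_length` and crashed.
# Reading max_length as a fallback via getattr keeps both cases working:
seq_length = getattr(config, "seq_length", config.max_length)
print(seq_length)  # 2048
```

With a real Qwen config that carries seq_length, getattr would return that value instead of falling back to max_length.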
Thanks! Got the model data — it works now.
Could I ask how much VRAM you needed to run it? Mine ran out of GPU memory halfway through.
I'm on a single 3090 and it runs.
<python/path>/lib/python3.10/site-packages/vary/model/llm/qwen/modeling_qwen.py
I saw it use about 35 GB of VRAM; it seems to adjust dynamically.
Hello! Could you please send me the model data as well? I'm also learning this — thanks! Email: wang_luochao@163.com
I have 24 GB and can't run it — it only gets halfway. What can I do?
I git-cloned the clip-vit-large-patch14 model from Hugging Face into the /cache/vit-large-patch14/ directory. Then the command below fails with the error shown. Did I get --model-name wrong, or is the mistake somewhere else?
Command: /Vary/Vary-master/vary# python ./demo/run_qwen_vary.py --model-name /cache/vit-large-patch14/ --image-file /mnt/e/ocr.png
Error:
(vary) root@90bb63a226b2:/Vary/Vary-master/vary# python ./demo/run_qwen_vary.py --model-name /cache/vit-large-patch14/ --image-file /mnt/e/ocr.png
The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.
You are using a model of type clip to instantiate a model of type vary. This is not supported for all configurations of models and can yield errors.
Traceback (most recent call last):
  File "/Vary/Vary-master/vary/./demo/run_qwen_vary.py", line 127, in <module>
    eval_model(args)
  File "/Vary/Vary-master/vary/./demo/run_qwen_vary.py", line 43, in eval_model
    model = varyQwenForCausalLM.from_pretrained(model_name, low_cpu_mem_usage=True, device_map='cuda', trust_remote_code=True)
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2876, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/vary_qwen_vary.py", line 238, in __init__
    self.transformer = varyQwenModel(config)
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/vary_qwen_vary.py", line 46, in __init__
    super(varyQwenModel, self).__init__(config)
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/llm/qwen/modeling_qwen.py", line 496, in __init__
    [
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/llm/qwen/modeling_qwen.py", line 497, in <listcomp>
    QWenBlock(
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/llm/qwen/modeling_qwen.py", line 393, in __init__
    self.attn = QWenAttention(config)
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/llm/qwen/modeling_qwen.py", line 120, in __init__
    self.seq_length = config.seq_length
  File "/root/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/configuration_utils.py", line 261, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'varyConfig' object has no attribute 'seq_length'. Did you mean: 'max_length'?
(vary) root@90bb63a226b2:/Vary/Vary-master/vary#
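The failure at the bottom of the trace can be reproduced in plain Python with an illustrative config object that, like varyConfig, defines max_length but not seq_length (class and variable names here are made up for the demo):

```python
# Illustrative stand-in for varyConfig.
class VaryConfigLike:
    max_length = 2048  # defined, unlike seq_length

cfg = VaryConfigLike()

# Same failure mode as modeling_qwen.py line 120 (`config.seq_length`):
try:
    _ = cfg.seq_length
    raised = False
except AttributeError:
    raised = True

# The suggested one-line patch reads max_length instead, which exists:
patched = cfg.max_length
print(raised, patched)  # True 2048
```

This is why the patch above resolves the crash: it swaps the missing attribute for one the config actually carries.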