Open Vinter8848 opened 1 year ago
I encountered the following error when running this command:

```
python evaluate/lora.py --quantize llm.int8
```

Can you provide some ideas to help me solve this problem? If you can, I would be extremely grateful.
```
Loading model ...
Traceback (most recent call last):
  File "/home/Zhengwt/lit-llama/evaluate/lora.py", line 172, in <module>
    CLI(main)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/jsonargparse/_cli.py", line 85, in CLI
    return _run_component(component, cfg_init)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/jsonargparse/_cli.py", line 147, in _run_component
    return component(**cfg)
  File "/home/Zhengwt/lit-llama/evaluate/lora.py", line 105, in main
    model = LLaMA.from_name(name)
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 124, in from_name
    return cls(LLaMAConfig.from_name(name))
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 59, in __init__
    h=nn.ModuleList(Block(config) for _ in range(config.n_layer)),
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 279, in __init__
    self += modules
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 320, in __iadd__
    return self.extend(modules)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 401, in extend
    for i, module in enumerate(modules):
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 59, in <genexpr>
    h=nn.ModuleList(Block(config) for _ in range(config.n_layer)),
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 150, in __init__
    self.attn = CausalSelfAttention(config)
  File "/home/Zhengwt/lit-llama/lit_llama/lora.py", line 428, in __init__
    self.c_attn = MergedLinear(
  File "/home/Zhengwt/lit-llama/lit_llama/lora.py", line 134, in __init__
    nn.Linear.__init__(self, in_features, out_features, **kwargs)
  File "/home/Zhengwt/lit-llama/lit_llama/quantization.py", line 45, in __init__
    super().__init__(*args, **kwargs, has_fp16_weights=False, threshold=6.0)
TypeError: super(type, obj): obj must be an instance or subtype of type
```

This has been fixed in lit-gpt: https://github.com/Lightning-AI/lit-gpt
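For context on the error itself: the traceback suggests that `--quantize llm.int8` swaps in a quantized `Linear` class, while LoRA's `MergedLinear` calls `nn.Linear.__init__(self, ...)` explicitly. The quantized class's `__init__` then runs with a `self` that is not an instance of that class, and its zero-argument `super()` call fails. The snippet below is a minimal sketch of that mechanism with hypothetical class names (`Patched`, `Unrelated` are illustrative, not lit-llama code):

```python
class Base:
    def __init__(self):
        pass

class Patched(Base):
    def __init__(self):
        # Zero-argument super() binds to Patched's MRO, so it requires
        # `self` to be an instance (or subtype) of Patched.
        super().__init__()

class Unrelated:
    # Stands in for a class outside Patched's hierarchy, like
    # MergedLinear relative to the swapped-in quantized Linear.
    pass

obj = Unrelated()
try:
    # Calling Patched.__init__ directly with a foreign `self`
    # reproduces the failure mode from the traceback.
    Patched.__init__(obj)
except TypeError as e:
    print(e)  # super(type, obj): obj must be an instance or subtype of type
```

This is why the error surfaces only with quantization enabled: without the class swap, the explicit `nn.Linear.__init__` call resolves to a class that `MergedLinear` actually subclasses.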