Lightning-AI / lit-llama

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.

TypeError: super(type, obj): obj must be an instance or subtype of type #445

Open Vinter8848 opened 1 year ago

Vinter8848 commented 1 year ago

```
Loading model ...
Traceback (most recent call last):
  File "/home/Zhengwt/lit-llama/evaluate/lora.py", line 172, in <module>
    CLI(main)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/jsonargparse/_cli.py", line 85, in CLI
    return _run_component(component, cfg_init)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/jsonargparse/_cli.py", line 147, in _run_component
    return component(**cfg)
  File "/home/Zhengwt/lit-llama/evaluate/lora.py", line 105, in main
    model = LLaMA.from_name(name)
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 124, in from_name
    return cls(LLaMAConfig.from_name(name))
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 59, in __init__
    h=nn.ModuleList(Block(config) for _ in range(config.n_layer)),
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 279, in __init__
    self += modules
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 320, in __iadd__
    return self.extend(modules)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 401, in extend
    for i, module in enumerate(modules):
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 59, in <genexpr>
    h=nn.ModuleList(Block(config) for _ in range(config.n_layer)),
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 150, in __init__
    self.attn = CausalSelfAttention(config)
  File "/home/Zhengwt/lit-llama/lit_llama/lora.py", line 428, in __init__
    self.c_attn = MergedLinear(
  File "/home/Zhengwt/lit-llama/lit_llama/lora.py", line 134, in __init__
    nn.Linear.__init__(self, in_features, out_features, **kwargs)
  File "/home/Zhengwt/lit-llama/lit_llama/quantization.py", line 45, in __init__
    super().__init__(*args, **kwargs, has_fp16_weights=False, threshold=6.0)
TypeError: super(type, obj): obj must be an instance or subtype of type
```
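For anyone debugging this: the failing call is the zero-argument `super()` in `quantization.py`, which resolves the parent class through the hidden `__class__` closure cell captured when the method was defined. If the class object that cell points to has since been replaced (for example by a module reload or by monkey-patching a class binding, as quantized `Linear` layers typically are), the instance passed in is no longer a subtype of the remembered class, and Python raises exactly this error. A minimal sketch of the mechanism (illustrative names only, not lit-llama code):

```python
class Base:
    pass

class Child(Base):
    def __init__(self):
        # Zero-argument super() captures the *current* Child class in a
        # hidden __class__ cell at class-definition time.
        super().__init__()

OldChild = Child

# Redefine Child, as effectively happens when a module is reloaded or a
# class binding is monkey-patched and later rebound.
class Child(Base):
    pass

obj = Child()  # instances of the new class work fine on their own

try:
    # Call the stale method on an instance of the *new* class: its
    # __class__ cell still points at OldChild, and obj is not an
    # instance of OldChild.
    OldChild.__init__(obj)
except TypeError as e:
    print(e)  # super(type, obj): obj must be an instance or subtype of type
```

If this shows up in a long-lived session (e.g. a notebook) or after editing modules in place, restarting the interpreter so every class is defined exactly once usually clears it.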

Vinter8848 commented 1 year ago

I encountered the error above when running this command:

```shell
python evaluate/lora.py --quantize llm.int8
```

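For context, the `llm.int8` path builds the model while the `Linear` class is temporarily swapped for a quantized subclass, and a class swap like this is the kind of situation where a stale zero-argument `super()` can surface. A torch-free sketch of the swap pattern (all class and function names here are illustrative, not lit-llama's actual API):

```python
import types
from contextlib import contextmanager

# Stand-ins for torch.nn.Linear and a bitsandbytes-backed subclass.
nn = types.SimpleNamespace()

class Linear:
    def __init__(self, in_features, out_features):
        self.in_features, self.out_features = in_features, out_features

class Linear8bit(Linear):
    def __init__(self, in_features, out_features):
        super().__init__(in_features, out_features)
        self.quantized = True

nn.Linear = Linear

@contextmanager
def quantization(enabled=True):
    """Temporarily rebind nn.Linear to the quantized subclass."""
    original = nn.Linear
    if enabled:
        nn.Linear = Linear8bit
    try:
        yield
    finally:
        # Restore the original class even if model construction fails.
        nn.Linear = original

with quantization():
    layer = nn.Linear(4, 8)  # actually constructs a Linear8bit

print(type(layer).__name__)  # Linear8bit
```

If the swap is not undone cleanly (or the module defining the subclass is imported twice under different paths), methods can end up holding references to a class object that is no longer the one in use, which matches the `TypeError` in this issue.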
Vinter8848 commented 1 year ago

Could you suggest some ideas to help me solve this problem? I would be extremely grateful.

carmocca commented 1 year ago

This has been fixed in lit-gpt: https://github.com/Lightning-AI/lit-gpt