InternLM / lmdeploy

LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
https://lmdeploy.readthedocs.io/en/latest/
Apache License 2.0

[Docs] got an unexpected keyword argument 'enable_lora' #1151

Open · sleepwalker2017 opened this issue 8 months ago

sleepwalker2017 commented 8 months ago

📚 The doc issue

from lmdeploy.messages import PytorchEngineConfig
from lmdeploy.pytorch.engine.engine import Engine

# register one LoRA adapter with the PyTorch engine
adapters = {'adapter0': '/root/.cache/huggingface/hub/models--tloen--alpaca-lora-7b/snapshots/12103d6baae1b320aa60631b38acb6ea094a0539/'}
engine_config = PytorchEngineConfig(adapters=adapters)

model_path = '/data/weilong.yu/lmdeploy/llama-7b'
engine = Engine.from_pretrained(model_path,
                                engine_config=engine_config,
                                trust_remote_code=True)
generator = engine.create_instance()
session_id = 0
input_ids = [0, 1, 2, 3] * 10

# stream a couple of steps, then stop
cnt = 0
for outputs in generator.stream_infer(session_id=session_id,
                                      input_ids=input_ids,
                                      adapter_name='adapter0'):
    cnt += 1
    if cnt == 2:
        break
print(outputs)
print('finish')

# close session and release caches
generator.end(session_id)

This is the config file of the adapter that the code downloaded when I previously ran S-LoRA:

{
  "base_model_name_or_path": "decapoda-research/llama-7b-hf",
  "bias": "none",
  "enable_lora": null,
  "fan_in_fan_out": false,
  "inference_mode": true,
  "lora_alpha": 16,
  "lora_dropout": 0.05,
  "merge_weights": false,
  "modules_to_save": null,
  "peft_type": "LORA",
  "r": 16,
  "target_modules": [
    "q_proj",
    "k_proj",
    "v_proj",
    "o_proj"
  ],
  "task_type": "CAUSAL_LM"
}

Is this a problem with the format of the downloaded file, or a compatibility issue in lmdeploy?
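
One quick way to check is to diff the keys in adapter_config.json against the fields of peft's LoraConfig. This is just a sketch, not part of lmdeploy; it assumes peft is importable in the same environment and reuses the adapter path from the snippet above:

import json
from dataclasses import fields
from pathlib import Path

from peft import LoraConfig

# adapter path from the repro above
adapter_dir = Path('/root/.cache/huggingface/hub/models--tloen--alpaca-lora-7b/'
                   'snapshots/12103d6baae1b320aa60631b38acb6ea094a0539')

config_keys = set(json.loads((adapter_dir / 'adapter_config.json').read_text()))
known_keys = {f.name for f in fields(LoraConfig)}

# keys the old adapter carries but the installed peft no longer accepts,
# e.g. 'enable_lora' and 'merge_weights'
print(sorted(config_keys - known_keys))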

Suggest a potential alternative/fix

No response

grimoire commented 8 months ago

The new peft is not compatible with the old adapter format. You can remove the unexpected key/value pairs to make it usable.
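
For example, dropping those keys from a single adapter_config.json could look like the sketch below. The legacy key names enable_lora and merge_weights come from the config posted above; the .bak copy is just a precaution:

import json
import shutil
from pathlib import Path

# keys written by old peft versions that newer LoraConfig rejects
LEGACY_KEYS = {'enable_lora', 'merge_weights'}

def clean_adapter_config(adapter_dir: str) -> None:
    """Rewrite adapter_config.json without the legacy keys, keeping a backup."""
    cfg_path = Path(adapter_dir) / 'adapter_config.json'
    shutil.copyfile(cfg_path, str(cfg_path) + '.bak')
    cfg = json.loads(cfg_path.read_text())
    cfg = {k: v for k, v in cfg.items() if k not in LEGACY_KEYS}
    cfg_path.write_text(json.dumps(cfg, indent=2))

clean_adapter_config('/root/.cache/huggingface/hub/models--tloen--alpaca-lora-7b/'
                     'snapshots/12103d6baae1b320aa60631b38acb6ea094a0539/')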

sleepwalker2017 commented 8 months ago

The new peft is not compatible with the old adapter format. You can remove the unexpected key/value pairs to make it usable.

I can modify the config file to make it run, but I need to test hundreds of LoRA adapters and I don't want to modify each of them by hand.

Where can I get an old adapter that can be loaded successfully? Could you give a link? Thank you.
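
For the many-adapter case, one option is to script the same cleanup instead of editing each file by hand. A sketch, assuming the adapters sit in one sub-directory each under a common root (the /data/lora_adapters path is hypothetical); only adapter_config.json is rewritten, the weight files are left untouched:

import json
from pathlib import Path

# keys written by old peft versions that newer LoraConfig rejects
LEGACY_KEYS = {'enable_lora', 'merge_weights'}

# hypothetical layout: one sub-directory per adapter under a common root
adapter_root = Path('/data/lora_adapters')

for cfg_path in adapter_root.glob('*/adapter_config.json'):
    cfg = json.loads(cfg_path.read_text())
    dropped = LEGACY_KEYS & cfg.keys()
    if dropped:
        cfg = {k: v for k, v in cfg.items() if k not in LEGACY_KEYS}
        cfg_path.write_text(json.dumps(cfg, indent=2))
        print(f'{cfg_path.parent.name}: removed {sorted(dropped)}')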