-
Hi,
I noticed that the repository currently lacks support for the InternLM2.5 models (1.8B, 7B, 20B), which may cause compatibility issues or missing steps for users trying to run them. It would …
-
# vllm==0.3.0
from vllm import LLM, SamplingParams

llm = LLM(
    model="internlm/internlm2-chat-20b",
    gpu_memory_utilization=0.85,
    max_model_len=2000,
    trust_remote_code=True,
)…
-
The MindSearch and lagent repositories were freshly cloned via git (lagent is not installed directly because part of its code needs to be modified); everything else was installed normally with pip.
I changed the model on line 15 of terminal.py to a local model, internlm2-chat-20b-4bit (internlm2-chat-20b quantized with lmdeploy).
When running mindsearch/terminal.py, the following error appears, although it does not affect the final result:…
-
Do you have plans to support InternLM-7B & InternLM-20B, which are similar to the LLaMA model? (https://github.com/InternLM/InternLM)
Thanks!
-
internlm2_chat_20b raises an error during xtuner chat:
ValueError: Target modules {'k_proj', 'up_proj', 'o_proj', 'q_proj', 'gate_proj', 'down_proj', 'v_proj'} not found in the base model. Please check the target modules…
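This error typically means the LoRA target-module names were written for LLaMA-style layers, while InternLM2 uses fused projections with different names (`wqkv`, `wo` for attention; `w1`, `w2`, `w3` for the MLP). A minimal sketch of how target-module matching generally works (this is simplified illustration, not the actual PEFT implementation; the module-name lists below are hypothetical):

```python
def find_target_modules(module_names, targets):
    """Return the module names whose last dotted component is in `targets`."""
    return {name for name in module_names if name.split(".")[-1] in targets}

# LLaMA-style layer names (hypothetical, for illustration).
llama_modules = [
    "model.layers.0.self_attn.q_proj",
    "model.layers.0.self_attn.k_proj",
    "model.layers.0.mlp.gate_proj",
]

# InternLM2-style layer names (hypothetical, for illustration).
internlm2_modules = [
    "model.layers.0.attention.wqkv",
    "model.layers.0.attention.wo",
    "model.layers.0.feed_forward.w1",
]

targets = {"q_proj", "k_proj", "v_proj", "o_proj",
           "gate_proj", "up_proj", "down_proj"}

print(find_target_modules(llama_modules, targets))      # non-empty: matches found
print(find_target_modules(internlm2_modules, targets))  # empty set -> ValueError in PEFT
```

If no module name ends in any target, PEFT raises exactly the ValueError above, so the fix is to pass InternLM2's own module names as targets.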
-
### Checklist
- [X] 1. I have searched related issues but cannot get the expected help.
- [ ] 2. The bug has not been fixed in the latest version.
- [ ] 3. Please note that if the bug-related iss…
-
I found that the benchmark suite outputs the time to first token. However, when I run `python benchmark.py --model meta-llama/Llama-2-7b-hf static --isl 128 --osl 128 --batch 1`, an error occurs:…
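For context, time-to-first-token (TTFT) is the delay between submitting a request and receiving the first generated token; it is usually dominated by the prefill over the input sequence. A minimal sketch of how it can be measured over a streaming output (the `fake_stream` generator is a hypothetical stand-in for a model's token stream, not any benchmark's real API):

```python
import time

def time_to_first_token(stream):
    """Measure TTFT over a token stream.

    Returns (ttft_seconds, tokens): TTFT is the time spent waiting for the
    first token; the remaining tokens are then drained to completion.
    """
    start = time.perf_counter()
    it = iter(stream)
    first = next(it)               # blocks until the first token arrives
    ttft = time.perf_counter() - start
    return ttft, [first, *it]

def fake_stream():
    """Hypothetical token stream: a slow first token, then fast decode steps."""
    time.sleep(0.05)               # simulated prefill latency
    yield "Hello"
    for tok in (",", " world"):
        time.sleep(0.001)          # simulated per-token decode latency
        yield tok

ttft, tokens = time_to_first_token(fake_stream())
print(f"TTFT: {ttft * 1000:.1f} ms, tokens: {tokens}")
```

Here TTFT reflects only the wait for the first token, while total latency would also include the per-token decode time for the rest of the output.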
-
Is there a plan to open-source a 14B version?