-
According to the demo code in the README, the images are provided in the first chat round and the image tokens are placed at the front of the question.
``` python
# Demo code in readme.
# multi-round multi-imag…
-
I tried running SFT on the 2B model on a single A100 and found that the loss did not decrease, no matter which dataset I used.
After a long investigation, I noticed in the output log that optimizer_allgather was always 0 and guessed it might be related to running on a single GPU. I then tried 2x A100 and the problem went away.
Probably very few people will run into this XD
The launch script is as follows:
```bash
set -x
GPUS=${GPUS:-1}
BATCH_SIZE=${BATCH_S…
-
ModuleNotFoundError: No module named 'ilagent'
Traceback:
File "Miniconda/envs/internlm/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 535, in _run_script
e…
-
### Checklist
- [X] 1. I have searched related issues but cannot get the expected help.
- [X] 2. The bug has not been fixed in the latest version.
### Describe the bug
Started the service following the README on huggingface:…
-
### Describe the question.
I am currently doing full-parameter fine-tuning of internlm2-chat-20b with xtuner's ZeRO-3, but on 8*A100 I can only fine-tune with a 2k context. Does internevo support 200k, or fairly long contexts of several tens of k? How should it be configured?
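For a sense of why 2k is the ceiling here, a rough back-of-envelope estimate can help (assumed round numbers, not measured; assuming mixed-precision Adam with all model states sharded by ZeRO-3):

```python
# Back-of-envelope memory sketch for full fine-tuning of a 20B model on
# 8 x A100-80GB with ZeRO-3. Assumption: mixed-precision Adam keeps roughly
# 16 bytes of model state per parameter (fp16 weights + fp16 grads + fp32
# master weights, momentum and variance), sharded across GPUs; activations
# are NOT sharded by ZeRO and grow with context length.
GIB = 1024 ** 3

n_params = 20e9            # internlm2-chat-20b
n_gpus = 8                 # 8 x A100
gpu_mem_gib = 80           # A100-80GB

state_per_gpu_gib = n_params * 16 / n_gpus / GIB
headroom_gib = gpu_mem_gib - state_per_gpu_gib

print(f"model state per GPU: ~{state_per_gpu_gib:.1f} GiB")   # ~37 GiB
print(f"left for activations: ~{headroom_gib:.1f} GiB")        # ~43 GiB
```

Under these assumptions the sharded model state alone takes roughly half of each GPU, so context length is capped by activation memory; reaching tens of k or 200k tokens generally requires techniques that shrink per-GPU activations, such as activation checkpointing and sequence parallelism.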
-
### Describe the error
Running the streaming API example code on this page raises an error: [InternLM2-Chat-7B-SFT](https://modelscope.cn/models/Shanghai_AI_Laboratory/internlm2-chat-7b-sft/summary)
```
Exception in thread Thread-2 (stream_producer):
Traceback (mo…
-
xtuner latest version 0.1.15dev0, and also xtuner 0.1.13.
I didn't know which fine-tuning script to choose for the 1.8b model, so I reused my previous one:
xtuner copy-cfg internlm2_chat_7b_qlora_oasst1_e3 .
The error is as follows:
```
(xtuner0305) zhanghui@zhanghui:~/shishen18$ xtuner train ./internl…
-
### Describe the bug
When loading the model with transformers, there seems to be an issue where English answers are missing spaces.
### Environment
It can be reproduced on transformers 4.34.0, 4.37.1 a…
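One common cause of missing spaces is decoding tokens one at a time in a streaming loop: SentencePiece-style vocabularies mark a leading space with the "▁" metasymbol, and a naive per-piece decode drops it. The sketch below is purely illustrative (it is not the actual InternLM2 tokenizer):

```python
# Illustrative sketch of the "▁" metasymbol convention used by
# SentencePiece-style tokenizers (NOT the actual InternLM2 tokenizer).
TOKENS = ["▁Hello", ",", "▁how", "▁are", "▁you", "?"]

def decode_piece(piece: str) -> str:
    # Per-piece decode, as some streaming loops do: strip the metasymbol only,
    # losing the word boundary it encodes.
    return piece.replace("▁", "")

def decode_sequence(pieces: list[str]) -> str:
    # Sequence-level decode: "▁" becomes a space, leading space trimmed.
    return "".join(p.replace("▁", " ") for p in pieces).lstrip()

naive = "".join(decode_piece(p) for p in TOKENS)
proper = decode_sequence(TOKENS)
print(naive)   # Hello,howareyou?
print(proper)  # Hello, how are you?
```

If the repro here decodes generated ids token by token, comparing it against a single decode of the full output sequence would help narrow down whether this is the cause.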
-
Hi!
Thank you for making your models public :)
In your paper, you mention that "In practical applications, we utilize the InternLM2-7B-Chat-SFT variant as our LLM.".
Does that mean that _al…
-
### Checklist
- [X] 1. I have searched related issues but cannot get the expected help.
- [ ] 2. The bug has not been fixed in the latest version.
### Describe the bug
Deployed InternVL2-2B-A… via the following command