OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone
Apache License 2.0

Same image, same prompt, but the results differ every run. Can the results be made identical each time, and which parameters control this? #183

Closed: cuppersd closed this issue 1 month ago

cuppersd commented 1 month ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in FAQ?

Current Behavior

No response

Expected Behavior

No response

Steps To Reproduce

No response

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response

YuzaChongyi commented 1 month ago

Hi, you can pass `sampling=False` when calling the `chat` function; this uses beam search decoding with the default `num_beams=3`, which is deterministic. Alternatively, you can set the random seed with `torch.cuda.manual_seed_all` before each call, which also makes the results reproducible.
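
For reference, a minimal sketch of both options, assuming the Hugging Face `chat` interface shown in the repository README (the image path, prompt, seed value, and sampling parameters below are illustrative):

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

# Load MiniCPM-Llama3-V 2.5 with its custom chat interface
model = AutoModel.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5',
                                  trust_remote_code=True,
                                  torch_dtype=torch.float16).to('cuda')
model.eval()
tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-Llama3-V-2_5',
                                          trust_remote_code=True)

image = Image.open('example.jpg').convert('RGB')  # placeholder image
msgs = [{'role': 'user', 'content': 'Describe this image.'}]

# Option 1: disable sampling -> deterministic beam search (default num_beams=3)
res = model.chat(image=image, msgs=msgs, tokenizer=tokenizer, sampling=False)
print(res)

# Option 2: keep sampling, but fix the random seed before every call
torch.manual_seed(0)
torch.cuda.manual_seed_all(0)
res = model.chat(image=image, msgs=msgs, tokenizer=tokenizer,
                 sampling=True, temperature=0.7)
print(res)
```

Note that beam search with `sampling=False` is deterministic for a fixed input, while fixing the seed only reproduces one particular sampled output for the same input and generation settings.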