MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone
[BUG] Can the chat method accept a max_tokens or max_new_tokens parameter? How is output length managed? #295
Open
orderer0001 opened this issue 1 week ago
Is there an existing issue / discussion for this?
Is there an existing answer for this in the FAQ?
Current Behavior
The chat method does not accept a max_tokens or max_new_tokens parameter.
Expected Behavior
The ability to control the length of the generated content.
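A common pattern in Hugging Face-style chat wrappers is for the chat method to forward extra keyword arguments through to generate, so a max_new_tokens argument can cap output length even when it is not a named parameter. Below is a minimal, self-contained sketch of that forwarding pattern; the chat and generate functions here are illustrative stand-ins, not the actual MiniCPM-V implementation:

```python
def generate(input_ids, max_new_tokens=512, **kwargs):
    # Stub standing in for a transformers-style generate():
    # pretend to decode tokens, capped at max_new_tokens.
    fake_output = list(range(1000))  # placeholder "token ids"
    return fake_output[:max_new_tokens]

def chat(msgs, **generation_kwargs):
    # Forward any extra keyword arguments straight to generate(),
    # so callers can manage output length via max_new_tokens.
    input_ids = [0]  # placeholder for the tokenized conversation
    return generate(input_ids, **generation_kwargs)

short = chat([{"role": "user", "content": "hi"}], max_new_tokens=8)
print(len(short))  # 8
```

If the installed version's chat signature does not forward kwargs this way, the fallback is usually to call the model's generate method directly with max_new_tokens set.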
Steps To Reproduce
No response
Environment
Anything else?
No response