OpenBMB / llama.cpp

Port of Facebook's LLaMA model in C/C++
MIT License

Is invocation via an API supported? #6

Closed Single430 closed 3 months ago

Single430 commented 5 months ago

I tried the server; one parameter isn't supported, and minicpm-cli doesn't have corresponding support either.
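For context, this is a minimal sketch of the kind of API-style call being asked about, assuming a llama.cpp-style server listening on localhost port 8080 and exposing upstream llama.cpp's /completion endpoint. The endpoint name, port, and request fields follow upstream's server example and are assumptions here; whether this fork's MiniCPM build accepts them is exactly what the issue is asking.

```python
import json
import urllib.request

# Sketch of an API-style request to a llama.cpp server (assumption: a server
# is running locally, e.g. started with `./server -m model.gguf --port 8080`
# as in upstream llama.cpp; not confirmed for this fork).
payload = {
    "prompt": "Describe what MiniCPM-V can do.",
    "n_predict": 64,      # max number of tokens to generate
    "temperature": 0.7,
}

req = urllib.request.Request(
    "http://localhost:8080/completion",  # upstream endpoint name (assumption)
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        # Upstream's server returns the generated text under "content".
        print(json.loads(resp.read())["content"])
except OSError as e:
    # Ends up here if no server is running or the request is rejected.
    print(f"request failed: {e}")
```

The same request can be made from the shell with `curl http://localhost:8080/completion -d '<json>'`; the question in this thread is whether the fork's server honors all of these parameters.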

tc-mb commented 5 months ago

I tried the server; one parameter isn't supported, and minicpm-cli doesn't have corresponding support either.

It's probably not supported yet; I haven't gotten to that part.

awfty commented 4 months ago

I tried the server; one parameter isn't supported, and minicpm-cli doesn't have corresponding support either.

It's probably not supported yet; I haven't gotten to that part.

Is there a plan to support it? Thanks.

tc-mb commented 3 months ago

@all Hi, I don't always keep an eye on the issues area of this forked repo.

  1. If this issue still needs an answer, please open an issue in the main repo with the "llamacpp" label. I will respond very quickly there.
  2. If this issue no longer needs an answer, I will close it this week.