You can configure a custom translation service: https://hcfy.app/docs/services/custom-api

If the custom translation service doesn't meet your needs, feel free to leave another comment.
Originally posted by @lmk123 in https://github.com/hcfyapp/crx-selection-translate/issues/1992#issuecomment-2071367179
I tried the custom translation service, and it really doesn't meet my needs.

Why I want this:

Although llama's translation quality doesn't match OpenAI's, a locally deployed model translates even faster than Baidu Translate, so I'd still like it to be available as an option.

The curl request looks like this (the prompt has been shortened to keep this brief):
```shell
curl http://192.168.0.115:11434/api/generate -d '{ "model": "llama3.2:3b", "prompt": "Hello" }'
```
The response format is as follows:
```json
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470425744Z","response":"How","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470429501Z","response":" can","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470468444Z","response":" I","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470469957Z","response":" assist","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470477Z","response":" you","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470478343Z","response":" today","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470484434Z","response":"?","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470498901Z","response":"","done":true,"done_reason":"stop","context":[128006,9125,128007,271,38766,1303,33025,2696,25,6790,220,2366,18,271,128009,128006,882,128007,271,9906,128009,128006,78191,128007,271,4438,649,358,7945,499,3432,30],"total_duration":90880784,"load_duration":8884284,"prompt_eval_count":26,"prompt_eval_duration":6156000,"eval_count":8,"eval_duration":30376000}
```
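For reference, Ollama's `/api/generate` streams newline-delimited JSON by default (sending `"stream": false` in the request body returns one complete object instead). A minimal Python sketch of how a client could reassemble the streamed chunks into the full reply, using the fragments from the response above (metadata fields trimmed for brevity):

```python
import json

def collect_stream(lines):
    """Reassemble the full text from Ollama's streaming NDJSON chunks.

    Each line is one JSON object; the "response" field holds a token
    fragment, and "done": true marks the final (empty) chunk.
    """
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Sample chunks taken from the response shown above.
sample = [
    '{"response":"How","done":false}',
    '{"response":" can","done":false}',
    '{"response":" I","done":false}',
    '{"response":" assist","done":false}',
    '{"response":" you","done":false}',
    '{"response":" today","done":false}',
    '{"response":"?","done":false}',
    '{"response":"","done":true}',
]
print(collect_stream(sample))  # → How can I assist you today?
```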
I've added my own proxy layer in front of it, so I no longer need this.
Ollama will be supported; you can follow progress in #2111.