Closed gzcqy closed 3 months ago
First, start your local model server using ollama's OpenAI-compatible mode: https://ollama.fan/reference/openai/
Then use Agently's OpenAI-compatible mode to connect to that local server: http://agently.cn/features/model_request.html#openaiopenailike
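Putting the two links together, a minimal Python sketch for pointing Agently at an ollama-served model might look like the following. The setting keys (`current_model`, `model.OAIClient.*`), the default ollama port `11434`, and the dummy API key are assumptions drawn from the docs linked above — check the Agently page if the key names differ in your version:

```python
# Sketch (assumptions noted above): connect Agently to ollama's
# OpenAI-compatible endpoint serving a locally pulled model.
import Agently  # pip install Agently

OLLAMA_OPENAI_URL = "http://localhost:11434/v1"  # ollama's default OpenAI-compatible base URL (assumed)
MODEL_NAME = "qwen:7b"                           # the model you pulled into ollama

agent_factory = Agently.AgentFactory()
(
    agent_factory
    # Use the OpenAI-compatible client and point it at the local server
    .set_settings("current_model", "OAIClient")
    .set_settings("model.OAIClient.url", OLLAMA_OPENAI_URL)
    .set_settings("model.OAIClient.options", {"model": MODEL_NAME})
    # ollama does not check the key, but the client expects one to be set
    .set_settings("model.OAIClient.auth", {"api_key": "ollama"})
)

agent = agent_factory.create_agent()
print(agent.input("Say hello in one sentence.").start())
```

This requires `ollama serve` to be running with `qwen:7b` already pulled (`ollama pull qwen:7b`); it is a sketch of the wiring, not a verified implementation.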
I migrated over from langchain — langchain was too hard to use. My current question is: if I want to use the qwen:7b model loaded by ollama, how do I add it in Agently?