[Closed] GOOD-N-LCM closed this issue 1 month ago
When calling the API, I get a 400 (bad request) response. These are my request parameters:

```json
{
  "chatModel": {
    "provider": "ollama",
    "model": "qwen2.5:3b"
  },
  "optimizationMode": "speed",
  "focusMode": "webSearch",
  "query": "What is Perplexica",
  "history": []
}
```
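For reference, the failing call can be reproduced with a short script like the one below (a minimal sketch; the `/api/search` path is my assumption and is not stated in this issue, only the port matches the `/api/models` URL further down):

```python
import requests

# Hypothetical reproduction of the failing request. The /api/search
# path is an assumption; adjust it if your endpoint differs.
payload = {
    "chatModel": {"provider": "ollama", "model": "qwen2.5:3b"},
    "optimizationMode": "speed",
    "focusMode": "webSearch",
    "query": "What is Perplexica",
    "history": [],
}

resp = requests.post("http://localhost:3001/api/search", json=payload)
print(resp.status_code)  # 400 in my setup
print(resp.text)
```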
Requesting http://localhost:3001/api/models returns:

```json
{
  "chatModelProviders": {
    "custom_openai": {}
  },
  "embeddingModelProviders": {
    "local": {
      "xenova-bge-small-en-v1.5": { "displayName": "BGE Small" },
      "xenova-gte-small": { "displayName": "GTE Small" },
      "xenova-bert-base-multilingual-uncased": { "displayName": "Bert Multilingual" }
    }
  }
}
```
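The same diagnostic can be scripted (a minimal sketch of the check above):

```python
import requests

# List which chat-model providers the server reports. With the
# response shown above, this prints ['custom_openai'], i.e. no
# "ollama" provider is registered, which is presumably why
# "provider": "ollama" is rejected with 400.
models = requests.get("http://localhost:3001/api/models").json()
print(list(models.get("chatModelProviders", {}).keys()))
```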
Ollama's endpoint is http://localhost:11434/api/chat, and it is reachable.
How can I use Ollama correctly through the API?
I only provide support in English. Please re-open the issue with an English description.