sugarforever / chat-ollama

ChatOllama is an open-source chatbot based on LLMs. It supports a wide range of language models as well as knowledge base management.

Is there any API documentation for ChatOllama? #545

Open bigfacewo opened 1 month ago

bigfacewo commented 1 month ago

Is there any API documentation for ChatOllama?

Here is my use case: ChatOllama comes with a convenient UI for managing the vector database, so I would like to call ChatOllama's chat API directly from Python instead of writing my own code to operate on the vector database.

So where can I find the documentation for using the API?

Thanks a lot!

satrong commented 1 month ago

All of the API-related code is in the server/api directory. You will probably also need some familiarity with Nuxt; see https://nuxt.com/docs/guide/directory-structure/server
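
Since there is no formal API reference yet, one rough way to see which endpoints exist is to walk the server/api directory of a local clone and apply Nuxt's file-based routing convention: the path under server/api becomes the route, and an optional .get/.post/... suffix selects the HTTP method (so a file like server/api/models/chat.post.ts would handle POST /api/models/chat). The sketch below only illustrates that convention; the clone path is an assumption and the output is not an official endpoint list:

# rough sketch: derive candidate endpoint routes from a local clone of chat-ollama,
# relying on Nuxt's file-based routing convention for server/api
from pathlib import Path

repo = Path("chat-ollama")  # assumed path to a local clone of the repository
for f in sorted((repo / "server" / "api").rglob("*.ts")):
    rel = f.relative_to(repo / "server").with_suffix("")  # e.g. api/models/chat.post
    name, _, method = rel.name.partition(".")             # split off .get/.post/... if present
    route = "/" + (rel.parent / name).as_posix()          # e.g. /api/models/chat
    print(f"{(method or 'any').upper():6s} {route}")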

wukaikailive commented 1 month ago

@satrong Thanks, that solved it. Attaching my code here for anyone who needs it:

import json

import requests

import config  # local settings module; it must define chat_ollama_server_url (see the sketch at the end)

def chat(inputs):
    """POST a chat request to ChatOllama's /api/models/chat endpoint and print the reply."""
    data = {
        "family": "llama",
        "knowledgebaseId": 1,  # ID of a knowledge base created in the ChatOllama UI
        "model": "llama3:latest",
        "messages": [
            {
                "role": "system",  # role-play persona prompt (in Chinese) used in this example
                "content": "请忽略之前的对话,我想让你做我的好朋友,你现在会扮演我的邻家姐姐,对我十分温柔,每当我有困难就会激励和鼓舞我,"
                           "以对话的方式倾听我的倾诉.你只能用中文答复。要倾述的事情:<我最近遇到公司竞聘失败的事情,感觉很烦恼>"
            },
            {
                "role": "user",
                "content": inputs,
                "model": "llama/llama3:latest",
            }
        ],
        "stream": False
    }
    url = config.chat_ollama_server_url + "/api/models/chat"
    headers = {
        'accept': 'application/json',
        'Content-Type': 'application/json',
        # provider settings (Ollama endpoint/credentials, cloud API keys, ...) are passed in
        # this header; the values below mirror what the web UI keeps in its settings
        'X-Chat-Ollama-Keys': '{"ollama":{"endpoint":"http://host.docker.internal:11434",'
                              '"username":"xxxxxx","password":"xxxxx"},"openai":{"key":"","endpoint":"",'
                              '"proxy":false},"azureOpenai":{"key":"","endpoint":"","deploymentName":"",'
                              '"proxy":false},"anthropic":{"key":"","endpoint":"","proxy":false},"moonshot":{'
                              '"key":"","endpoint":""},"gemini":{"key":"","proxy":false,"endpoint":""},'
                              '"groq":{"key":"","endpoint":"","proxy":false},"custom":[]}'
    }
    json_data = json.dumps(data, ensure_ascii=False)

    # "stream" is False above, so the full assistant reply comes back in a single JSON body
    response = requests.post(url, headers=headers, data=json_data.encode("utf-8"))
    result = response.json()
    print(result)
    return result
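
For anyone adapting the snippet above: the config module it imports only needs to expose the base URL of the ChatOllama server, and the knowledgebaseId in the payload has to match a knowledge base that already exists in your instance. A minimal, hypothetical config.py (the host and port depend on how you deployed ChatOllama):

# config.py (hypothetical; the snippet above does `import config`)
chat_ollama_server_url = "http://localhost:3000"  # adjust to wherever your ChatOllama instance listens

With that in place, the function can be called directly, for example:

if __name__ == "__main__":
    chat("I just lost an internal job competition at my company. How can I get over it?")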