fruitbars / simple-one-api

OpenAI-compatible API adapter supporting the Baidu Qianfan LLM platform, iFlytek Spark, Tencent Hunyuan, MiniMax, DeepSeek, and other OpenAI-compatible APIs. A single executable, extremely simple configuration, one-click deployment, ready to use out of the box. Seamlessly integrate with OpenAI and compatible APIs using a single executable for quick setup and deployment.
MIT License

About the `/v1/models` and `/v1/models/:model` interfaces and config.json for Claude via proxy. #71

Open hongyi-zhao opened 2 months ago

hongyi-zhao commented 2 months ago

I'm running simple-one-api with the following config file:

$ cat config.json 
{
  "debug": false,
  "load_balancing": "random",
  "services": {
     "openai": [
      {
        "models": ["chatgpt-4o-latest"],
        "enabled": true,
        "credentials": {
          "api_key": "sk-xxx"
        },
        "server_url":"https://api.gptsapi.net/v1/chat/completions"
      }
    ]
  }
}

I can confirm it's working:

$ curl http://127.0.0.1:9090/v1/models
{
    "data": [
        {
            "id": "chatgpt-4o-latest",
            "object": "model",
            "created": 1725721937,
            "owned_by": "openai"
        },
        {
            "id": "random",
            "object": "model",
            "created": 1725721937,
            "owned_by": "openai"
        }
    ],
    "object": "list"

Then I tried to access the model via the /v1/models/:model interface as follows but failed:

$ curl http://127.0.0.1:9090/v1/models/:chatgpt-4o-latest   -H "Content-Type: application/json"   -H "Authorization: Bearer 123456"   -d '{
  "model": "chatgpt-4o-latest",
  "messages": [{"role": "user", "content": "Hello, GPT-4!"}]
}'
{"error":"Path not found"}

So, how should the /v1/models and /v1/models/:model interfaces be used as access endpoints?

Regards, Zhao

fruitbars commented 2 months ago

Hi, you don't need a colon. Use this:

curl http://127.0.0.1:9090/v1/models/chatgpt-4o-latest -H "Content-Type: application/json" -H "Authorization: Bearer 123456" -d '{ "model": "chatgpt-4o-latest", "messages": [{"role": "user", "content": "Hello, GPT-4!"}] }'
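
As a side note, the ":model" in the documented path /v1/models/:model is route-parameter notation, not a literal colon, and in the upstream OpenAI API the model-retrieval endpoint is a body-less GET. If simple-one-api mirrors that route (a guess, not verified against the code), the simplest form would be:

curl http://127.0.0.1:9090/v1/models/chatgpt-4o-latest \
  -H "Authorization: Bearer 123456"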

hongyi-zhao commented 2 months ago

Hi, you don't need a colon. Use this:

Thank you for the tip. Anyway, it seems that the documentation needs to be fixed:

[screenshot of the documentation]

BTW, how should config.json be set up for Claude with the official API? I tried the following config.json, but it failed:

$ cat config.json 
{
  "debug": false,
  "load_balancing": "random",
  "proxy":{
    "type":"http",
    "http_proxy": "http://127.0.0.1:8080",
    "strategy": "default"
  },  
  "services": {
      "claude": [
      {
        "models": ["claude-3-5-sonnet-20240620"],
        "enabled": true,
        "credentials": {
          "api_key": "my-claude-api-key"
        },
        "server_url":"https://api.anthropic.com/v1/messages",
        "use_proxy":true
      }
    ],
     "openai": [
      {
        "models": ["chatgpt-4o-latest"],
        "enabled": true,
        "credentials": {
          "api_key": "my-openai-api-key"
        },
        "server_url":"https://api.gptsapi.net/v1/chat/completions"
      }
    ]
  }
}

fruitbars commented 2 months ago

For Claude's configuration, this setup should work in principle. If there are any issues, please provide the error log information from the backend calls.

{
  "debug": false,
  "load_balancing": "random",
  "proxy": {
    "type": "http",
    "http_proxy": "http://127.0.0.1:8080",
    "strategy": "default"
  },
  "services": {
    "claude": [
      {
        "models": [
          "claude-3-5-sonnet-20240620"
        ],
        "enabled": true,
        "credentials": {
          "api_key": "my-claude-api-key"
        },
        "server_url": "https://api.anthropic.com/v1/messages",
        "use_proxy": true
      }
    ]
  }
}
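
One way to isolate proxy problems is to call the Anthropic API through the same HTTP proxy directly, bypassing simple-one-api entirely (a sketch; the x-api-key and anthropic-version headers follow Anthropic's public API documentation):

curl -x http://127.0.0.1:8080 https://api.anthropic.com/v1/messages \
  -H "x-api-key: my-claude-api-key" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-3-5-sonnet-20240620", "max_tokens": 64, "messages": [{"role": "user", "content": "ping"}]}'

If this direct call succeeds but the route through simple-one-api does not, the problem is more likely in the service configuration than in the network path.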

hongyi-zhao commented 2 months ago

See the following for more details:

The content of config_claude.json:

werner@x13dai-t:~/Public/repo/github.com/fruitbars$ cat config_claude.json 
{
  "api_key": "123456",
  "debug": false,
  "load_balancing": "random",
  "proxy": {
    "type": "http",
    "http_proxy": "http://127.0.0.1:8080",
    "strategy": "default"
  },
  "services": {
    "claude": [
      {
        "models": [
          "claude-3-5-sonnet-20240620"
        ],
        "enabled": true,
        "credentials": {
          "api_key": "<my-claude-api-key>"
        },
        "server_url": "https://api.anthropic.com/v1/messages",
        "use_proxy": true
      }
    ]
  }
}

The command to run simple-one-api:

werner@x13dai-t:~/Public/repo/github.com/fruitbars$ ./simple-one-api.git/simple-one-api config_claude.json 
2024/09/09 22:10:37 config.go:188: config name: /home/werner/Public/repo/github.com/fruitbars/config_claude.json
2024/09/09 22:10:37 config.go:197: config_claude json
2024/09/09 22:10:37 config.go:225: { false  {default http http://127.0.0.1:8080   0} 123456 random [] map[] map[] map[claude:[{ [claude-3-5-sonnet-20240620] true map[api_key:my-claude-api-key] [] https://api.anthropic.com/v1/messages map[] map[] {0 0 0 0 0} 0xc00044370e 0}]] {false  0 0} false []}
2024/09/09 22:10:37 config.go:238: {default http http://127.0.0.1:8080   0}
2024/09/09 22:10:37 config.go:246: read LoadBalancingStrategy ok, random
2024/09/09 22:10:37 config.go:254: read ServerPort ok, :9090
2024/09/09 22:10:37 config.go:259: log level:  
2024/09/09 22:10:37 config.go:119: Models: [claude-3-5-sonnet-20240620], service Timeout:0,Limit Timeout: 0, QPS: 0, QPM: 0, RPM: 0,Concurrency: 0
2024/09/09 22:10:37 config.go:268: GlobalModelRedirect:  map[]
2024/09/09 22:10:37 config.go:397: other support models: [claude-3-5-sonnet-20240620]
2024/09/09 22:10:37 config.go:275: SupportMultiContentModels:  [gpt-4o gpt-4-turbo glm-4v gemini-* yi-vision gpt-4o*]
2024/09/09 22:10:37 initializer.go:24: config.InitConfig ok
2024/09/09 22:10:37 logger.go:14: level mode 
2024/09/09 22:10:37 logger.go:31: level mode default prod
2024/09/09 22:10:37 logger.go:47: log plain-text format
2024/09/09 22:10:37 initializer.go:31: config.LogLevel ok
2024-09-09T22:10:37.784+0800    WARN    simple-one-api.git/main.go:63   check EnableWeb config  {"config.GSOAConf.EnableWeb": false}

The curl test works as follows:

$ curl http://127.0.0.1:9090/v1/chat/completions   -H "Content-Type: application/json"   -H "Authorization: Bearer 123456"   -d '{
  "model": "claude-3-5-sonnet-20240620",
  "messages": [{"role": "user", "content": "Hello, GPT-4!"}],
 "max_tokens":1024}'
{"id":"msg_01DybG9mpASVtLDiQCCXuC5S","object":"message","created":1725931067,"model":"claude-3-5-sonnet-20240620","choices":[{"index":0,"message":{"role":"assistant","content":"Hello! It's nice to meet you. How can I assist you today? Feel free to ask me any questions or let me know if you need help with anything."},"logprobs":null,"finish_reason":"stop"}],"usage":{"prompt_tokens":14,"completion_tokens":37,"total_tokens":51}}

But the test in gpt_academic failed as follows:

[screenshot of the failed test in gpt_academic]

The corresponding stdout log of simple-one-api:

2024-09-10T09:39:28.281+0800    ERROR   mycommon/common_err_resp.go:27  Unexpected status code  {"status": 400, "body": "{\"type\":\"error\",\"error\":{\"type\":\"invalid_request_error\",\"message\":\"messages: Unexpected role \\\"system\\\". The Messages API accepts a top-level `system` parameter, not \\\"system\\\" as an input message role.\"}}"}
2024-09-10T09:39:28.281+0800    ERROR   handler/openai_claude_handler.go:84 sendClaudeRequest   {"error": "status 400: {\"type\":\"error\",\"error\":{\"type\":\"invalid_request_error\",\"message\":\"messages: Unexpected role \\\"system\\\". The Messages API accepts a top-level `system` parameter, not \\\"system\\\" as an input message role.\"}}"}
2024-09-10T09:39:28.281+0800    ERROR   handler/openai_claude_handler.go:51 status 400: {"type":"error","error":{"type":"invalid_request_error","message":"messages: Unexpected role \"system\". The Messages API accepts a top-level `system` parameter, not \"system\" as an input message role."}}   {"claudeServerURL": "https://api.anthropic.com/v1/messages", "claudeReq": {"model":"claude-3-5-sonnet-20240620","messages":[{"role":"system","content":"Serve me as a writing and programming assistant."},{"role":"user","content":"hello."}],"max_tokens":0,"stream":true,"temperature":1,"top_k":1,"top_p":1}, "oaiReq": {"model":"claude-3-5-sonnet-20240620","messages":[{"role":"system","content":"Serve me as a writing and programming assistant."},{"role":"user","content":"hello."}],"temperature":1,"top_p":1,"n":1,"stream":true}}
2024-09-10T09:39:28.282+0800    ERROR   handler/openai_handler.go:285   status 400: {"type":"error","error":{"type":"invalid_request_error","message":"messages: Unexpected role \"system\". The Messages API accepts a top-level `system` parameter, not \"system\" as an input message role."}}
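
The log points to the request translation: the Anthropic Messages API only accepts "user" and "assistant" roles inside messages and takes the system prompt as a top-level "system" field. The failing request would therefore need to be rewritten by the proxy roughly as follows before being forwarded (field names per Anthropic's public documentation; note also that the log shows "max_tokens":0 being sent, while Anthropic requires a positive max_tokens):

{
  "model": "claude-3-5-sonnet-20240620",
  "max_tokens": 1024,
  "stream": true,
  "temperature": 1,
  "top_p": 1,
  "system": "Serve me as a writing and programming assistant.",
  "messages": [
    {"role": "user", "content": "hello."}
  ]
}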

hongyi-zhao commented 2 months ago

I've created https://github.com/fruitbars/simple-one-api/pull/73, which fixes the above problem, as shown below:

$ curl http://127.0.0.1:9090/v1/chat/completions   -H "Content-Type: application/json"   -H "Authorization: Bearer 123456"   -d '{
  "model": "claude-3-5-sonnet-20240620",
  "messages": [{"role": "user", "content": "Hello, GPT-4!"}],
 "max_tokens":1024}'
{"id":"msg_01SFC3tbYGjLsQ4nBzcMJooL","object":"message","created":1725936330,"model":"claude-3-5-sonnet-20240620","choices":[{"index":0,"message":{"role":"assistant","content":"Hello! It's nice to meet you. How can I assist you today? Feel free to ask me any questions or let me know if you need help with anything."},"logprobs":null,"finish_reason":"stop"}],"usage":{"prompt_tokens":14,"completion_tokens":37,"total_tokens":51}}

[screenshot]

hongyi-zhao commented 2 months ago

Hi, you don't need a colon. Use this:

curl http://127.0.0.1:9090/v1/models/chatgpt-4o-latest -H "Content-Type: application/json" -H "Authorization: Bearer 123456" -d '{ "model": "chatgpt-4o-latest", "messages": [{"role": "user", "content": "Hello, GPT-4!"}] }'

I tried the following, but it still failed:

$ curl http://127.0.0.1:9090/v1/models
{
    "data": [
        {
            "id": "claude.ai/claude-3-5-sonnet-20240620",
            "object": "model",
            "created": 1726191949,
            "owned_by": "openai"
        },
        {
            "id": "my-claude-3-5-sonnet-20240620",
            "object": "model",
            "created": 1726191949,
            "owned_by": "openai"
        },
        {
            "id": "random",
            "object": "model",
            "created": 1726191949,
            "owned_by": "openai"
        }
    ],
    "object": "list"

$ curl http://127.0.0.1:9090/v1/models/my-claude-3-5-sonnet-20240620  -H "Content-Type: application/json"   -H "Authorization: Bearer 123456"   -d '{
  "model": "my-claude-3-5-sonnet-20240620",
  "messages": [{"role": "user", "content": "Hello, GPT-4!"}]
}'
{"error":"Path not found"}

On the other hand, if a model name contains a "/", such as "claude.ai/claude-3-5-sonnet-20240620", the resulting URL looks very strange. Therefore, I think this calling method has little practical value.
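
Two notes on the failure above: adding -d makes curl issue a POST, while the model-retrieval endpoint in the upstream OpenAI API is a body-less GET, so the route may simply not be registered for POST; and a model id containing "/" can in principle be percent-encoded in the path (%2F), although whether the router decodes that depends on how simple-one-api is set up:

curl http://127.0.0.1:9090/v1/models/claude.ai%2Fclaude-3-5-sonnet-20240620 \
  -H "Authorization: Bearer 123456"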