64bit / async-openai

Rust library for OpenAI
https://docs.rs/async-openai
MIT License

Running the `assistants` example results in Error #221

Closed: SimonCW closed this issue 3 months ago

SimonCW commented 3 months ago

Running the example at https://github.com/64bit/async-openai/blob/532a60f63b679e14a1eb37a907e357c083a48c99/examples/assistants/src/main.rs results in the error below. Adding a few `.context(...)` calls helped me narrow it down to the call that creates a thread:

    let thread = client
        .threads()
        .create(thread_request.clone())
        .await
        .context("Failed to create thread")?;

Error

2024-05-07T14:49:32.808024Z ERROR async_openai::error: failed deserialization of: File Not Found
Error: JSONDeserialize(Error("expected value", line: 1, column: 1))

The only change I made is in setting up the client. I'm running this against a local llamafile and I don't have an OpenAI key to test the raw example. This is how I set up the client:

    // create a client
    let local_conf = OpenAIConfig::new()
        .with_api_key("sk-no-key-required")
        .with_api_base("http://localhost:8080/v1");
    let client = Client::with_config(local_conf);

kitalia commented 3 months ago

Response deserialization is failing in my OpenAI assistant implementation too; for a few days now I've been getting a deserialization error like the one below when asking for an update. I wasn't able to find what is missing from my implementation. Supposedly everything is implemented, and it does work on 0.20.0, but I couldn't find the breaking change in 0.21.0 other than a minor `.role(MessageRole::User)`.

 Error: failed deserialization of: {
  "object": "list",
  "data": [
    {
      "id": "XXX",
      "object": "thread.message",
      "created_at": 1715093590,
      "assistant_id": "XXX",
      "thread_id": "XXX",
      "run_id": "XXX",
      "role": "assistant",
      "content": [
        {
          "type": "text",
          "text": {
            "value": "XXX---ANSWER HERE---XXX",
            "annotations": [
              {
                "type": "file_citation",
                "text": "【4:0†source】",
                "start_index": 587,
                "end_index": 599,
                "file_citation": {
                  "file_id": "file-XXX"
                }
              }
            ]
          }
        }
      ],
      "attachments": [],
      "metadata": {}
    },
    {
      "id": "XXX",
      "object": "thread.message",
      "created_at": 1715093587,
      "assistant_id": null,
      "thread_id": "XXX",
      "run_id": null,
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": {
            "value": "XXX---QUESTION HERE---XXX",
            "annotations": []
          }
        }
      ],
      "attachments": [],
      "metadata": {}
    }
  ],
  "first_id": "XXX",
  "last_id": "XXX",
  "has_more": false
}

64bit commented 3 months ago

@SimonCW It appears to be an issue with the llamafile response "File Not Found"; that is not a valid response per the OpenAI API, so it cannot be deserialized.
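
The reported error is what `serde_json` (the crate async-openai deserializes with) produces when the body isn't JSON at all, e.g. a plain-text 404 page. A minimal sketch to reproduce it:

    // "File Not Found" is a plain-text body, not JSON, so serde_json
    // rejects it at the very first character (line 1, column 1):
    fn main() {
        let body = "File Not Found";
        let parsed: Result<serde_json::Value, _> = serde_json::from_str(body);
        // Prints: Err(Error("expected value", line: 1, column: 1))
        println!("{parsed:?}");
    }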

64bit commented 3 months ago

@kitalia I believe your issue is separate: it appears the spec and the actual response have drifted. Please try the following to see if your deserialization issue goes away:

Use a local git copy of the crate in your project and make the `quote` field of `FileCitation` optional: https://github.com/64bit/async-openai/blob/main/async-openai/src/types/message.rs#L147

That's the only field missing from the response you provided. If that fixes it, please let us know your findings; a PR is also most welcome!
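
A sketch of that change (struct layout assumed from the linked file, not an exact diff):

    use serde::{Deserialize, Serialize};

    // async-openai/src/types/message.rs (sketch; field layout assumed
    // from the linked file):
    #[derive(Debug, Serialize, Deserialize)]
    pub struct FileCitation {
        /// The ID of the specific file the citation is from.
        pub file_id: String,
        /// The specific quote in the file. The response above omits it,
        /// so it has to be optional for deserialization to succeed.
        pub quote: Option<String>,
    }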

SimonCW commented 3 months ago

D'oh. Thank you @64bit! Llamafile doesn't support the assistants endpoints, only chat/completions. I assume that's the problem.

Closing
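
For reference, a chat-completions request against the same local config does work with llamafile. A minimal sketch (the model name is a placeholder assumption; llamafile serves whatever model it was started with):

    use async_openai::{
        config::OpenAIConfig,
        types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
        Client,
    };

    #[tokio::main]
    async fn main() -> Result<(), Box<dyn std::error::Error>> {
        // Same local config as above
        let local_conf = OpenAIConfig::new()
            .with_api_key("sk-no-key-required")
            .with_api_base("http://localhost:8080/v1");
        let client = Client::with_config(local_conf);

        let request = CreateChatCompletionRequestArgs::default()
            // Placeholder model name; adjust (or drop) as your server requires.
            .model("LLaMA_CPP")
            .messages([ChatCompletionRequestUserMessageArgs::default()
                .content("Say hello")
                .build()?
                .into()])
            .build()?;

        // This hits /v1/chat/completions, which llamafile does implement.
        let response = client.chat().create(request).await?;
        if let Some(content) = &response.choices[0].message.content {
            println!("{content}");
        }
        Ok(())
    }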