AnswerDotAI / cosette

Claudette's sister, a helper for OpenAI GPT
https://answerdotai.github.io/cosette/
Apache License 2.0

Strange bug with async client #11

Open ssslakter opened 4 weeks ago

ssslakter commented 4 weeks ago

Hi! I'm experiencing an issue when I use the async OpenAI client. It's not clear to me what the root of the issue is, but here's a minimal example:

from openai import AsyncOpenAI
from cosette import *

acli = AsyncOpenAI()
chat = Chat(cli=Client('gpt-4o-mini', cli=acli))
await chat("Hi")

The code fails with TypeError: Object of type coroutine is not JSON serializable.
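That error message suggests an un-awaited coroutine is ending up in the message history that later gets JSON-encoded for the API request. A minimal sketch of that failure mode (the fake_completion helper here is hypothetical, standing in for AsyncOpenAI's completion call):

```python
import asyncio
import json

async def fake_completion():
    # Stand-in for AsyncOpenAI's chat.completions.create
    return {"role": "assistant", "content": "Hi!"}

# Calling an async function without awaiting it returns a coroutine object
coro = fake_completion()
history = [{"role": "user", "content": "Hi"}, coro]

err = ""
try:
    # Roughly what the HTTP layer does when serializing the request body
    json.dumps(history)
except TypeError as e:
    err = str(e)

print(err)  # Object of type coroutine is not JSON serializable
coro.close()  # suppress the "coroutine was never awaited" warning
```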

But while debugging I found that removing the line in Chat.__call__ that appends to the history (self.h += mk_toolres(res, ns=self.tools)) seems to resolve the issue:

# this works
chat = Chat(cli=Client(model, cli=acli))
res = chat("Hi")
mk_toolres(res, ns=chat.tools)
await res

# this fails
chat = Chat(cli=Client(model, cli=acli))
res = chat("Hi")
chat.h += mk_toolres(res, ns=chat.tools)
await res

I'm running all code in a notebook
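One generic workaround, independent of cosette internals, is to resolve the coroutine before anything touches the history, so only the awaited response object is ever appended. A minimal sketch (chat_call and safe_call are hypothetical names, not part of cosette's API):

```python
import asyncio

async def _fake_completion():
    # Stand-in for the awaited AsyncOpenAI response
    return "resolved response"

def chat_call(msg):
    # Stand-in for Chat.__call__ when the underlying client is AsyncOpenAI:
    # it returns a coroutine instead of a response object
    return _fake_completion()

async def safe_call(msg):
    res = chat_call(msg)
    if asyncio.iscoroutine(res):
        res = await res  # resolve before appending to the history
    history = [msg, res]  # only fully-resolved objects reach the history
    return history

result = asyncio.run(safe_call("Hi"))
print(result)  # ['Hi', 'resolved response']
```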

ssslakter commented 4 weeks ago

@jph00 @ncoop57 Could you please look into this? Another question is whether cosette is generally supposed to support async calls at all. I tested async on Client directly and it works fine. I also played with structured a bit to make it work with async for faster requests (it's more of a draft; it's in my fork: https://github.com/ssslakter/cosette/commit/e202ff72d1029a83edf6039d39dc1437cd23528c)

jph00 commented 3 weeks ago

No, we haven't done anything to make cosette async-compatible. But we'd be delighted to get PRs which do that!