CakeCrusher opened 5 months ago
Would love to support this, and IMO it should be reasonably easy as we're already quite compatible. Let's tag with https://github.com/huggingface/chat-ui/labels/good%20first%20issue?
@julien-c Shifting to the Messages API should be simple, but I was thinking more of offloading responsibility for conversation storage to an open-source Assistants API (I am currently working on one in FastAPI here: https://github.com/OpenGPTs-platform/assistants-api ).
One of the major challenges of this refactor is that open-source Assistants APIs generally use Postgres (SQL), since the entities already have a clear, defined structure and it leaves the door open to deep references. This is to say a migration from Mongo to Postgres may be needed. I am not entirely opposed to reworking the open-source Assistants API to work with Mongo, but the collection structures would need to be reworked anyway.
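To illustrate the point about structure (this is a hypothetical sketch, not the actual OpenGPTs-platform schema): the Assistants entities such as threads and messages have fixed fields and a clear parent/child relationship, which maps naturally onto relational tables with foreign-key references. Using stdlib `sqlite3` as a stand-in for Postgres:

```python
# Sketch of how Assistants-style entities map onto a relational schema.
# sqlite3 stands in for Postgres; the SQL is deliberately generic.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE threads (
    id         TEXT PRIMARY KEY,   -- e.g. "thread_abc123"
    created_at INTEGER NOT NULL
);
CREATE TABLE messages (
    id         TEXT PRIMARY KEY,   -- e.g. "msg_abc123"
    thread_id  TEXT NOT NULL REFERENCES threads(id),  -- the "deep reference"
    role       TEXT NOT NULL,      -- "user" | "assistant"
    content    TEXT NOT NULL,
    created_at INTEGER NOT NULL
);
""")

conn.execute("INSERT INTO threads VALUES ('thread_1', 0)")
conn.execute(
    "INSERT INTO messages VALUES ('msg_1', 'thread_1', 'user', 'hi', 1)"
)
rows = conn.execute(
    "SELECT role, content FROM messages WHERE thread_id = 'thread_1'"
).fetchall()
print(rows)  # [('user', 'hi')]
```

In Mongo the same data would typically live as documents with embedded or loosely-referenced message arrays, which is why moving between the two means reworking the collection structures either way.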
(I'm getting back into it now) Running list of issues addressable out of the box by the Assistants API:
Hi @nsarrazin, I wanted to explore how we could collaborate on making chat-ui work with OpenAI standards, making it less opinionated about the hosted inference provider. I need this as I am part of a team open-sourcing the GPTs platform https://github.com/OpenGPTs-platform and we will be leveraging chat-ui as the client. So I was hoping we could align our objectives so that we can have a healthy collaboration instead of just diverging. The main point I wanted to touch on is as follows.
Is there any interest in transforming the backend to one that follows the OpenAI Assistants API structure, so that we may better align ourselves with the OpenAI standard? Based on the Discord announcement "...Message API with OpenAI compatibility for HF...", HF seems to signal that they are pushing in that direction, so it would make sense to support that in chat-ui. I haven't looked too deep into the codebase, but I imagine we will need to refactor the backend endpoints to support Assistants API endpoints and then use the openai client to make the requests.
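As a rough sketch of what that refactor implies on the wire (the base URL and helper name here are assumptions, not existing code): the backend would expose Assistants-style routes such as `POST /v1/threads` and `POST /v1/threads/{thread_id}/messages`, so any OpenAI-compatible client can talk to it. Shown with stdlib `urllib` just to make the endpoint shape concrete; in practice the official `openai` client would simply be pointed at the self-hosted base URL.

```python
# Hypothetical sketch of the Assistants-style endpoint shape the backend
# would expose. The request is built but not sent.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # assumed self-hosted Assistants API

def build_create_message_request(thread_id: str, content: str) -> urllib.request.Request:
    """Build (but do not send) a request matching the 'create message' route."""
    body = json.dumps({"role": "user", "content": content}).encode()
    return urllib.request.Request(
        url=f"{BASE_URL}/threads/{thread_id}/messages",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_create_message_request("thread_1", "Hello!")
print(req.full_url)      # http://localhost:8000/v1/threads/thread_1/messages
print(req.get_method())  # POST
```

The upside of matching these routes exactly is that chat-ui would not need a bespoke client at all.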
I am more than open to suggestions, and I look forward to exploring how we could collab!