ilikerobots opened this issue 1 month ago
Hi @ilikerobots, how is your assistant built? Do you mean an Assistant created via the OpenAI API, or a LangChain agent?
If it's a LangChain agent, you can override methods like get_llm to provide the existing agent chain: https://vintasoftware.github.io/django-ai-assistant/latest/reference/assistants-ref/#django_ai_assistant.helpers.assistants.AIAssistant.get_llm
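As a rough illustration of that override pattern (the AIAssistant stub below merely stands in for django_ai_assistant.helpers.assistants.AIAssistant; in a real project you would subclass the library's class and return an actual LangChain chat model such as ChatOpenAI, and the attribute names here are assumptions):

```python
# Sketch only: the AIAssistant stub stands in for
# django_ai_assistant.helpers.assistants.AIAssistant.

class AIAssistant:
    model = "gpt-4o"  # hypothetical default model attribute

    def get_llm(self):
        # The library builds its default chat model here.
        return {"model": self.model, "source": "default"}

class ExistingChainAssistant(AIAssistant):
    def get_llm(self):
        # Override to plug in the LLM from your pre-existing agent chain,
        # e.g. return ChatOpenAI(model=self.model, temperature=0)
        return {"model": self.model, "source": "existing-agent"}

print(ExistingChainAssistant().get_llm()["source"])  # -> existing-agent
```

The point is only that get_llm is the seam: whatever it returns is what the assistant uses downstream, so returning your own model object swaps it in without touching the rest of the library.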
I believe LangChain can also use existing OpenAI API assistants, but we would have to check the docs on how to build a chain that uses an existing assistant. See: https://python.langchain.com/v0.1/docs/modules/agents/agent_types/openai_assistants/#using-existing-assistant
Then you can fetch the existing assistant and perhaps replace the agent declaration here: https://github.com/vintasoftware/django-ai-assistant/blob/da8f104be31c2ef38f093c01920d75ee05b838c0/django_ai_assistant/helpers/assistants.py#L436
You can do that by inheriting from AIAssistant and overriding as_chain.
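A hedged sketch of how those two pieces might fit together: subclass AIAssistant and override as_chain to return LangChain's OpenAIAssistantRunnable pointed at the existing assistant's ID. The classes below are stubs standing in for the real django_ai_assistant.AIAssistant and langchain.agents.openai_assistant.OpenAIAssistantRunnable, the assistant ID is made up, and the real as_chain signature may differ:

```python
# Sketch only: both classes are stubs for the real library classes.

class OpenAIAssistantRunnable:
    """Stub for LangChain's wrapper around a pre-defined OpenAI assistant.
    The real one is constructed similarly: OpenAIAssistantRunnable(
    assistant_id=..., as_agent=True)."""
    def __init__(self, assistant_id, as_agent=False):
        self.assistant_id = assistant_id
        self.as_agent = as_agent

class AIAssistant:
    """Stub for django_ai_assistant.helpers.assistants.AIAssistant."""
    def as_chain(self, thread_id=None):
        # The library normally assembles its default agent chain here.
        raise NotImplementedError

class PredefinedOpenAIAssistant(AIAssistant):
    assistant_id = "asst_example123"  # hypothetical ID of the existing assistant

    def as_chain(self, thread_id=None):
        # Replace the default agent with the pre-defined OpenAI assistant.
        return OpenAIAssistantRunnable(assistant_id=self.assistant_id,
                                       as_agent=True)

chain = PredefinedOpenAIAssistant().as_chain()
print(chain.assistant_id)  # -> asst_example123
```

Since as_chain is what django-ai-assistant invokes to run a conversation turn, returning the runnable for the existing assistant there would route messages through it instead of a freshly built agent.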
@fjsj, appreciate the comments. I did mean a pre-defined OpenAI assistant, not a LangChain agent. I'll look into this, but LangChain is all new to me, so it's taking a while. Feel free to close.
(btw, great job to those involved in this project: I'm deploying a help desk response assistant today using this)
Awesome @ilikerobots, thanks! We plan to support pre-defined OpenAI assistants, so I will leave the issue open for now.
If we don't change the library, at least some documentation about this is necessary. But we may be able to do some matching via assistant ID to ensure we're using the pre-defined assistant from OpenAI.
If I've already built an assistant in OpenAI, is it possible to use it for messages with django-ai-assistant?