First of all, thank you for creating and open-sourcing this awesome library. It would be great if we could define and use local models, maybe through the API feature of "LM Studio" or similar. The API it exposes is almost identical to OpenAI's.
This library will not support local LLMs due to several key factors:
- **Function Calling:** Essential for executing tools. OpenAI's models are specifically trained to call functions, eliminating the need for extra prompting. This is crucial for making agents reliable in production.
- **Document Uploads:** The Assistants API enables direct document uploads and storage within each agent, which makes it easy to expand your agent swarms. Implementing this with open-source models would require a complex vector store management system.
- **Future Capabilities:** OpenAI is likely to add more advanced features to the Assistants API, such as built-in conversation memory. With their first-mover advantage, it may unfortunately be very hard for open-source LLMs to ever catch up.
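To illustrate the first point: with OpenAI-style function calling, the model emits a structured tool call (name plus JSON-encoded arguments) that your code simply dispatches. Here is a minimal sketch of that dispatch step; the `get_weather` tool and its schema are hypothetical, and the `mock_call` stands in for the payload a chat completions response would contain. A local model would have to be prompted into producing this exact JSON shape reliably, which is the hard part.

```python
import json

# Hypothetical tool schema in the OpenAI function-calling format.
# The name "get_weather" and its parameters are illustrative only.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

def get_weather(location: str) -> str:
    # Stand-in implementation for the example.
    return f"Sunny in {location}"

def dispatch(tool_call: dict) -> str:
    """Route a model-produced tool call to the matching Python function."""
    fn = tool_call["function"]
    args = json.loads(fn["arguments"])  # the API returns arguments as a JSON string
    return {"get_weather": get_weather}[fn["name"]](**args)

# Shaped like a tool call from a chat completions response; a local model
# would need to emit this structure dependably for agents to work.
mock_call = {"function": {"name": "get_weather",
                          "arguments": '{"location": "Berlin"}'}}
print(dispatch(mock_call))  # Sunny in Berlin
```

Because OpenAI models are trained for this format, the dispatch layer stays this simple; with prompted open-source models you would additionally need validation and retry logic around malformed calls.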