AbanteAI / mentat

Mentat - The AI Coding Assistant
https://mentat.ai
Apache License 2.0

Use Spice #543

Closed — biobootloader closed this 3 months ago

biobootloader commented 3 months ago

Current status: Mentat runs with both OpenAI and Anthropic, through Spice. LiteLLM proxies and AzureOpenAI should also work, although I haven't tested them yet.
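For reference, the usual way to point an OpenAI-compatible client at a LiteLLM proxy is via environment variables. The variable names below follow the OpenAI SDK's conventions, not anything Spice-specific — Spice's own configuration may differ, and this path is untested here:

```shell
# Point an OpenAI-compatible client at a local LiteLLM proxy.
# Variable names follow OpenAI SDK conventions; Spice's config may differ.
export OPENAI_BASE_URL="http://localhost:4000"   # LiteLLM proxy's default port
export OPENAI_API_KEY="sk-anything"              # the proxy enforces its own keys
```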

Still to do before merging:

Pull Request Checklist

mentatbot[bot] commented 3 months ago

MENTAT CODE REVIEW IN ACTIVE DEVELOPMENT. Only in use on mentat and internal repos. Please Reply with feedback.

This pull request introduces significant changes, notably the switch to using the Spice client for LLM API calls and modifications to utility functions to simplify their interfaces. While these changes can potentially improve the codebase's maintainability and performance, careful consideration is required to ensure that all features and error handling previously provided by the OpenAI client are adequately covered. Additionally, the direct dependency on the spice Git repository should be managed carefully to avoid potential instability.
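On the Git-dependency point: pinning the dependency to a commit (or switching to a PyPI release once one exists) keeps the upstream repo from changing underneath the project. A sketch in pyproject.toml form — the `spiceai` package name appears later in this thread, and the version pin is purely illustrative:

```toml
[project]
dependencies = [
    # During development: pin the Git dependency to a specific commit
    # so upstream changes can't break builds unexpectedly.
    # "spiceai @ git+https://github.com/AbanteAI/spice@<commit-sha>",

    # Once published: prefer a versioned PyPI release.
    "spiceai>=0.1,<0.2",
]
```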

granawkins commented 3 months ago

The approach looks good to me, and this works as a starting point.

I had to install anthropic manually. Maybe it needs to be in Spice's pyproject?

Also getting these errors after each response:

[screenshot of the errors]

If Butler and Ragdaemon also use Spice directly (in addition to through Mentat), should they pass around a shared instance of Spice? Or use their own?

biobootloader commented 3 months ago

@granawkins

I had to install anthropic manually. Maybe it needs to be in Spice's pyproject?

It is in Spice's requirements.txt; it should install automatically once Spice is coming from PyPI instead of GitHub.

Also getting these errors after each response

Ah yeah, OpenAI doesn't send token counts when you stream (Anthropic does, which is nice!), so those will need to be counted client-side. I'm moving the token-counting code from Mentat to Spice now.
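Since OpenAI's streaming responses omit usage counts, the client has to count tokens itself. A minimal sketch of the idea — the tokenizer here is a stand-in; real code would use a model-specific tokenizer such as tiktoken:

```python
from typing import Callable, Iterable, List, Tuple

def count_streamed_tokens(
    chunks: Iterable[str],
    tokenize: Callable[[str], List[str]],
) -> Tuple[str, int]:
    """Accumulate streamed text chunks and count tokens once at the end.

    Counting the joined text once is more accurate than summing per-chunk
    counts, since a token boundary can straddle a chunk boundary.
    """
    full_text = "".join(chunks)
    return full_text, len(tokenize(full_text))

# Stand-in tokenizer for illustration; swap in e.g. tiktoken for real counts.
naive_tokenize = str.split

text, n_tokens = count_streamed_tokens(["Hello ", "wor", "ld"], naive_tokenize)
```

Note that summing per-chunk counts here would give 3, while the joined text is only 2 tokens — which is why the sketch counts once at the end.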

If Butler and Ragdaemon also use Spice directly (in addition to with Mentat), should they pass around an instance of Spice? Or use their own?

It should always be fine for them to use their own instances. The only exception would be if we want Spice to handle rate limiting of calls.
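To illustrate why a shared instance would only matter for rate limiting, here is a hedged sketch: `SpiceClient` and `SharedRateLimiter` are hypothetical stand-ins, not Spice's real API. With no shared limiter, independent instances are equivalent; with one, all callers get throttled together:

```python
import threading
import time
from typing import Optional

class SharedRateLimiter:
    """Hypothetical limiter: enforces a minimum interval between calls."""

    def __init__(self, max_calls_per_sec: float):
        self.min_interval = 1.0 / max_calls_per_sec
        self._lock = threading.Lock()
        self._last_call = 0.0

    def acquire(self) -> None:
        with self._lock:
            wait = self._last_call + self.min_interval - time.monotonic()
            if wait > 0:
                time.sleep(wait)
            self._last_call = time.monotonic()

class SpiceClient:
    """Stand-in for a Spice instance; an optional shared limiter gates calls."""

    def __init__(self, limiter: Optional[SharedRateLimiter] = None):
        self.limiter = limiter
        self.calls = 0

    def get_response(self, prompt: str) -> str:
        if self.limiter is not None:
            self.limiter.acquire()
        self.calls += 1
        return f"response to {prompt!r}"

# Independent instances: fine when no global rate limiting is needed.
butler = SpiceClient()
ragdaemon = SpiceClient()

# Shared limiter: pass one limiter (or one client) around so that
# Butler, Ragdaemon, and Mentat are throttled as a group.
limiter = SharedRateLimiter(max_calls_per_sec=100.0)
butler_shared = SpiceClient(limiter)
ragdaemon_shared = SpiceClient(limiter)
```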
