tryAGI / LangChain

C# implementation of LangChain. We try to be as close to the original as possible in terms of abstractions, but are open to new entities.
https://tryagi.github.io/LangChain/
MIT License

feat: Updated OpenRouter models #241

Closed · HavenDV closed this 4 months ago

HavenDV commented 4 months ago

Created by GitHub Actions



coderabbitai[bot] commented 4 months ago

Walkthrough

The OpenRouterModelProvider received a significant update: new models were added, existing ones were extended, and several were renamed or reconfigured. AllModels.cs likewise gained new model classes and updated identifiers and providers for existing ones, with one model removed and the remaining classes reorganized.

Changes

| File Path | Change Summary |
| --- | --- |
| .../OpenRouterModelProvider.cs | Added and extended models; renamed and updated configurations of existing models. |
| .../Predefined/AllModels.cs | Added new models; updated existing models with new identifiers and providers; removed one model; reorganized model classes. |

🐰✨
A hop, a skip, a code deploy,
In OpenRouter's land, oh boy!
New models spring like daisies wild,
Old ones groomed, none reviled.
Cheers to code that grows and shifts,
In digital fields, our spirits lift!
🌼🚀


Recent Review Details

**Configuration used: CodeRabbit UI**
**Review profile: CHILL**

Commits: files that changed from the base of the PR and between c59ee9fd286e2d8af2b496c44a47cc35b3c88abe and b850f8947fe5b20f67f11612b16aaafdd3a1d537.

Files selected for processing (3)

* src/Providers/OpenRouter/src/OpenRouterModelIds.cs (7 hunks)
* src/Providers/OpenRouter/src/OpenRouterModelProvider.cs (6 hunks)
* src/Providers/OpenRouter/src/Predefined/AllModels.cs (10 hunks)

Files not summarized due to errors (1)

* src/Providers/OpenRouter/src/OpenRouterModelIds.cs: Error: Message exceeds token limit

Additional comments not posted (72)
src/Providers/OpenRouter/src/OpenRouterModelProvider.cs (3)
`12-17`: Ensure that the model identifiers and configurations are consistent with the intended specifications.

`9-27`:

> :memo: **NOTE**
> This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [24-34]

Check the precision of floating-point values for model parameters to avoid potential rounding errors or unintended behavior.

`91-100`:

> :memo: **NOTE**
> This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [55-97]

Review the newly added models for any potential duplicates or inconsistencies in naming and parameters.
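To make the floating-point comment concrete, here is a minimal C# sketch of the registry pattern such a provider uses, assuming a simple dictionary keyed by model id. The `ChatModelMetadata` record, the per-token prices, and the `GetModelById` helper are illustrative stand-ins rather than the repository's actual types; `decimal` is used because tiny price literals like `0.000000225` round-trip exactly, which is one way to address the reviewer's rounding concern. The id strings follow OpenRouter's `vendor/model:variant` naming.

```csharp
using System;
using System.Collections.Generic;

public enum OpenRouterModelIds
{
    NousCapybara7BFree,
    Mythomax13B,
}

// Hypothetical metadata record; the real provider maps ids to configured chat models.
public sealed record ChatModelMetadata(
    string Id,
    decimal PricePerInputTokenUsd,
    decimal PricePerOutputTokenUsd);

public static class OpenRouterModelProviderSketch
{
    // decimal keeps tiny per-token prices exact; double literals such as
    // 0.000000225 can pick up binary rounding error.
    private static readonly Dictionary<OpenRouterModelIds, ChatModelMetadata> Models = new()
    {
        [OpenRouterModelIds.NousCapybara7BFree] =
            new("nousresearch/nous-capybara-7b:free", 0m, 0m),
        [OpenRouterModelIds.Mythomax13B] =
            new("gryphe/mythomax-l2-13b", 0.000000225m, 0.000000225m), // prices are made up for illustration
    };

    public static ChatModelMetadata GetModelById(OpenRouterModelIds id) =>
        Models.TryGetValue(id, out var metadata)
            ? metadata
            : throw new ArgumentOutOfRangeException(nameof(id), id, "Unknown OpenRouter model id.");
}
```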
src/Providers/OpenRouter/src/Predefined/AllModels.cs (6)
`3-6`: Ensure that the constructor parameters and base class initialization are correctly implemented for `NousCapybara7BFreeModel`.

`28-31`: Verify that the `Cinematika7BAlphaFreeModel` class is correctly set up with the appropriate model identifier and provider.

`63-66`: Check for consistency in the model parameters and documentation for `Mythomax13BModel`.

`113-116`: Confirm that the extended model `Mythomax13BExtendedModel` is correctly configured and documented.

`218-221`: Ensure that the `MetaLlama370BInstructModel` is correctly implemented with the right identifiers and parameters.

`428-431`: Double-check the implementation of `Mixtral8X7BInstructModel` to ensure it aligns with the intended model specifications.
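For context, the predefined classes these comments audit follow a thin-wrapper pattern: each class pins one `OpenRouterModelIds` member and forwards it to a shared base constructor. The sketch below assumes simplified stand-ins for `OpenRouterProvider` and the model base class; the repository's real types carry additional chat-model configuration.

```csharp
using System;

public enum OpenRouterModelIds { NousCapybara7BFree, Cinematika7BAlphaFree }

// Simplified stand-in: the real provider also holds the API key and endpoint.
public sealed class OpenRouterProvider
{
    public string ApiKey { get; }

    public OpenRouterProvider(string apiKey) =>
        ApiKey = apiKey ?? throw new ArgumentNullException(nameof(apiKey));
}

// Simplified stand-in for the shared chat-model base class.
public class OpenRouterModel
{
    public OpenRouterProvider Provider { get; }
    public OpenRouterModelIds Id { get; }

    public OpenRouterModel(OpenRouterProvider provider, OpenRouterModelIds id)
    {
        Provider = provider ?? throw new ArgumentNullException(nameof(provider));
        Id = id;
    }
}

// A predefined class only forwards the provider and its fixed model id,
// which is exactly what the `3-6` comment asks to verify.
public class NousCapybara7BFreeModel : OpenRouterModel
{
    public NousCapybara7BFreeModel(OpenRouterProvider provider)
        : base(provider, OpenRouterModelIds.NousCapybara7BFree) { }
}
```

With this shape, construction reduces to `new NousCapybara7BFreeModel(provider)`, so a wrong enum member in the `base(...)` call is the main failure mode the review is guarding against.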
src/Providers/OpenRouter/src/OpenRouterModelIds.cs (63)
`9-13`: Ensure that the summary for `NousCapybara7BFree` includes a link to the rate limits documentation.

`15-19`: The summary for `Mistral7BInstructFree` mentions "For v0.2, use this model." but does not provide a link or identifier for v0.2. Consider adding a specific reference or link to improve clarity.

`21-25`: The summary for `OpenChat35Free` is clear and concise. No changes needed.

`27-33`: The summary for `Mythomist7BFree` includes a `#merge` tag but does not explain what models are being merged. Consider adding details about the models involved in the merge for better clarity.

`6-38`:

> :memo: **NOTE**
> This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [35-46]

The summary for `ToppyM7BFree` is detailed and includes a list of merged models, which is good for transparency. However, ensure that all model names are correctly spelled and that the links to their respective sources are provided if available.

`48-53`: The summary for `Cinematika7BAlphaFree` mentions that the model is under development and to check the OpenRouter Discord for updates. Ensure that a link to the Discord server or a specific channel is provided for user convenience.

`54-59`: The summary for `GoogleGemma7BFree` is clear and includes a reference to Google's Gemma Terms of Use. Ensure that a link to the terms of use is provided for easy access.

`61-69`: The summary for `Psyfighter13B` is clear and includes details about the merge. Ensure that the model identifiers mentioned are correct and that any external resources are linked appropriately.

`75-382`:

> :memo: **NOTE**
> This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [70-80]

The summary for `PsyfighterV213B` provides a good description of the model's capabilities and the data it was trained on. Ensure that the model identifiers mentioned are correct and that any external resources are linked appropriately.

`83-85`: The summary for `NeuralChat7BV31` is concise and informative. No changes needed.

`91-98`: The summary for `NousHermes2Vision7BAlpha` includes a `#multimodal` tag, which is appropriate given the model's capabilities. Ensure that the project leaders mentioned are correctly spelled and that their roles are accurately described.

`103-108`: The summary for `MetaLlamaV213BChat` is clear and concise. No changes needed.

`111-113`: The summary for `PygmalionMythalion13B` includes a `#merge` tag. Ensure that the models being merged are specified for clarity.

`116-118`: The summary for `Xwin70B` is clear and includes a reference to its performance on AlpacaEval. Ensure that a link to the AlpacaEval results is provided for verification.

`125-127`: The summary for `Goliath120B` includes a `#merge` tag and credits to individuals involved in the development. Ensure that the contributions of each individual are clearly stated and that their roles are accurately described.

`131-133`: The summary for `Noromaid20B` includes `#merge` and `#uncensored` tags. Ensure that the implications of these tags are clearly explained and that the content is appropriate for all audiences.

`138-140`: The summary for `Mythomist7B` includes a `#merge` tag. Ensure that the models being merged are specified for clarity.

`144-146`: The summary for `MidnightRose70B` is clear and includes a reference to its predecessors. Ensure that the model names mentioned are correct and that any external resources are linked appropriately.

`150-152`: The summary for `Fimbulvetr11BV2` is clear and includes options for using different formats. Ensure that the formats mentioned are correctly spelled and that links to their descriptions are provided if available.

`156-158`: The summary for `RemmSlerp13BExtended` includes a note about it being an extended-context version. Ensure that the differences between this version and the standard version are clearly explained.

`161-164`: The summary for `Mythomax13BExtended` is similar to that for `RemmSlerp13BExtended`. Ensure that the descriptions are consistent and that the extended-context features are clearly explained.

`167-172`: The summary for `MetaLlama38BInstructExtended` is detailed and includes a reference to Meta's Acceptable Use Policy. Ensure that a link to the policy is provided for easy access.

`175-177`: The summary for `MancerWeaverAlpha` is clear and specifies its intended use. Ensure that the level of coherence and memory expected from this model is clearly stated.

`180-182`: The summary for `NousHermes13B` is concise and informative. No changes needed.

`186-188`: The summary for `NousCapybara7B` is clear and includes a reference to the techniques used for more consistent control. Ensure that the term "unalignment techniques" is clearly defined or linked to additional resources for clarity.

`191-193`: The summary for `MetaCodellama34BInstruct` is clear and includes a description of the model's capabilities. Ensure that the term "extensive input contexts" is clearly defined or linked to additional resources for clarity.

`196-198`: The summary for `MetaCodellama70BInstruct` is similar to that for `MetaCodellama34BInstruct`. Ensure that the descriptions are consistent and that the capabilities of the 70B model are clearly distinguished from the 34B model.

`201-203`: The summary for `PhindCodellama34BV2` is clear and includes a comparison to GPT-4. Ensure that the benchmarks mentioned are specified and that links to the results are provided for verification.

`206-208`: The summary for `OpenHermes2Mistral7B` is detailed and includes a description of the model's chat skills. Ensure that the term "system prompt capabilities" is clearly defined or linked to additional resources for clarity.

`212-214`: The summary for `OpenHermes25Mistral7B` includes a discussion of the training data ratio. Ensure that the terms "code instruction" and "non-code benchmarks" are clearly defined or linked to additional resources for clarity.

`217-219`: The summary for `RemmSlerp13B` includes a `#merge` tag. Ensure that the models being merged are specified for clarity.

`222-224`: The summary for `Cinematika7BAlpha` is similar to an earlier entry for this model. Ensure that the descriptions are consistent and that the development status is clearly stated.

`227-229`: The summary for `Yi34BChat` is clear and includes a reference to its instruct-tuning. Ensure that the term "instruct-tuned" is clearly defined or linked to additional resources for clarity.

`232-234`: The summary for `Yi34BBase` is concise and informative. No changes needed.

`237-239`: The summary for `Yi6BBase` is concise and informative. No changes needed.

`243-245`: The summary for `StripedhyenaNous7B` is detailed and includes a description of the new architecture used. Ensure that the terms "attention mechanisms" and "gated convolutions" are clearly defined or linked to additional resources for clarity.

`249-251`: The summary for `StripedhyenaHessian7BBase` is similar to that for `StripedhyenaNous7B`. Ensure that the descriptions are consistent and that the architectural details are clearly explained.

`255-257`: The summary for `Mixtral8X7BBase` includes a `#moe` tag. Ensure that the term "Sparse Mixture of Experts" is clearly defined or linked to additional resources for clarity.

`261-263`: The summary for `NousHermes2Yi34B` is clear and includes a comparison to past models. Ensure that the benchmarks mentioned are specified and that links to the results are provided for verification.

`268-270`: The summary for `NousHermes2Mixtral8X7BSft` includes a `#moe` tag and a reference to the training data. Ensure that the term "supervised finetune" is clearly defined or linked to additional resources for clarity.

`274-276`: The summary for `NousHermes2Mistral7BDpo` is detailed and includes a reference to the Direct Preference Optimization (DPO) technique. Ensure that the term "DPO" is clearly defined or linked to additional resources for clarity.

`279-283`: The summary for `MetaLlama370BInstruct` is clear and includes a reference to Meta's Acceptable Use Policy. Ensure that a link to the policy is provided for easy access.

`286-288`: The summary for `MistralOpenOrca7B` is concise and informative. No changes needed.

`291-293`: The summary for `HuggingFaceZephyr7B` is clear and includes a reference to the Direct Preference Optimization (DPO) technique. Ensure that the term "DPO" is clearly defined or linked to additional resources for clarity.

`297-299`: The summary for `OpenAiGpt35Turbo` is clear and includes a reference to the training data. Ensure that the training data timeframe is accurately described and that any external resources are linked appropriately.

`303-305`: The summary for `OpenAiGpt35Turbo16K0125` is detailed and includes a reference to a bug fix. Ensure that the bug fix is clearly explained and that any external resources are linked appropriately.

`308-310`: The summary for `OpenAiGpt35Turbo16K` is clear and includes a reference to the increased context length. Ensure that the term "context length" is clearly defined or linked to additional resources for clarity.

`314-316`: The summary for `OpenAiGpt4Turbo` is clear and includes a reference to the latest version of the model. Ensure that the model version mentioned is correct and that any external resources are linked appropriately.

`320-322`: The summary for `OpenAiGpt4TurboPreview` includes a note about rate limiting. Ensure that the implications of rate limiting are clearly explained and that any external resources are linked appropriately.

`325-327`: The summary for `OpenAiGpt4` is clear and includes a reference to the training data. Ensure that the training data timeframe is accurately described and that any external resources are linked appropriately.

`330-332`: The summary for `OpenAiGpt432K` is clear and includes a reference to the increased context length. Ensure that the term "context length" is clearly defined or linked to additional resources for clarity.

`337-339`: The summary for `OpenAiGpt4Vision` includes a `#multimodal` tag. Ensure that the model's vision capabilities are clearly explained and that any external resources are linked appropriately.

`342-344`: The summary for `OpenAiGpt35TurboInstruct` is clear and includes a reference to instructional prompts. Ensure that the term "instructional prompts" is clearly defined or linked to additional resources for clarity.

`347-349`: The summary for `GooglePalm2Chat` is clear and includes a reference to the model's capabilities. Ensure that the terms "multilingual," "reasoning," and "coding" are clearly defined or linked to additional resources for clarity.

`352-354`: The summary for `GooglePalm2CodeChat` is clear and includes a reference to the model's capabilities. Ensure that the terms "chatbot conversations" and "code-related questions" are clearly defined or linked to additional resources for clarity.

`357-359`: The summary for `GooglePalm2Chat32K` is similar to that for `GooglePalm2Chat`. Ensure that the descriptions are consistent and that the increased context length is clearly explained.

`362-364`: The summary for `GooglePalm2CodeChat32K` is similar to that for `GooglePalm2CodeChat`. Ensure that the descriptions are consistent and that the increased context length is clearly explained.

`369-371`: The summary for `GoogleGeminiPro10` is clear and includes a reference to Google's Gemini Terms of Use. Ensure that a link to the terms of use is provided for easy access.

`377-379`: The summary for `GoogleGeminiProVision10` includes a `#multimodal` tag. Ensure that the model's capabilities are clearly explained and that any external resources are linked appropriately.

`394-396`: The summary for `GoogleGeminiPro15Preview` includes a `#multimodal` tag and a note about the model being in preview. Ensure that the implications of the preview status are clearly explained and that any external resources are linked appropriately.

`399-401`: The summary for `PerplexityPplx70BOnline` includes an `#online` tag. Ensure that the model's online capabilities are clearly explained and that any external resources are linked appropriately.

`404-406`: The summary for `PerplexityPplx7BOnline` includes an `#online` tag. Ensure that the model's online capabilities are clearly explained and that any external resources are linked appropriately.

`409-411`: The summary for `PerplexityPplx7BChat` is clear and includes a reference to the model's capabilities. Ensure that the terms "chat model" and "7 billion parameters" are clearly defined or linked to additional resources for clarity.
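Most of the comments above concern XML `<summary>` blocks in OpenRouterModelIds.cs, a generated enum where each member documents its model so IDE tooltips explain what the id refers to. The fragment below sketches that pattern for two of the members discussed; the summary wording and the rate-limits URL are illustrative paraphrases, not the file's exact text.

```csharp
/// <summary>
/// Ids for the chat models exposed through OpenRouter.
/// </summary>
public enum OpenRouterModelIds
{
    /// <summary>
    /// Nous Capybara 7B (free). Free variants are subject to
    /// <a href="https://openrouter.ai/docs#limits">rate limits</a>.
    /// </summary>
    NousCapybara7BFree,

    /// <summary>
    /// Mistral 7B Instruct (free). For v0.2, use the dedicated v0.2
    /// model id rather than this one.
    /// </summary>
    Mistral7BInstructFree,
}
```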