Closed Jac-Zac closed 3 days ago
Latest commit: af8cf4ab86b1e9300a8d205b5ef8f85596733a4c
Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.
The changes across the codebase involve updates to model identifiers within the `modelMap` configuration in several files. Existing "llama3" model entries have been updated to their "llama3.1" versions, and a new model for tool usage has been introduced. Additionally, the `getAvailableModelChoicesGroq` function has been simplified by removing a parameter and adjusting its filtering logic to focus solely on LLM models.
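The `modelMap` change described above can be sketched roughly as follows. This is a hedged illustration, not the actual contents of `settings.ts`: the keys and model identifiers here are assumptions based on Groq's public model names and may differ from the real configuration.

```typescript
// Illustrative sketch of the updated modelMap (all identifiers are
// assumptions; the real settings.ts may use different keys/values).
const modelMap: Record<string, string> = {
  // "llama3" entries bumped to their "llama3.1" versions:
  "llama-3.1-70b": "llama-3.1-70b-versatile",
  "llama-3.1-8b": "llama-3.1-8b-instant",
  // New entry for the tool-usage preview model:
  "llama3-tool-use": "llama3-groq-70b-8192-tool-use-preview",
};
```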
| Files | Change Summary |
|---|---|
| `templates/components/settings/python/settings.py`<br>`templates/types/streaming/express/src/controllers/engine/settings.ts`<br>`templates/types/streaming/nextjs/app/api/chat/engine/settings.ts` | Updated model identifiers from "llama3" to "llama3.1" for two existing models and introduced a new entry for the tool-usage preview model. |
| `helpers/providers/groq.ts` | Simplified the `getAvailableModelChoicesGroq` function by removing the `selectEmbedding` parameter and adjusting model filtering to focus on LLMs. |
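The simplification in `helpers/providers/groq.ts` might look something like the sketch below. The `GroqModel` shape and field names are hypothetical stand-ins; only the described behavior (no `selectEmbedding` parameter, filtering down to LLM models) comes from the summary above.

```typescript
// Hypothetical model shape; the real helper may type this differently.
interface GroqModel {
  id: string;
  kind: "llm" | "embedding";
}

// After the change: no selectEmbedding parameter; the filter keeps
// only LLM entries, so embedding models are no longer offered here.
function getAvailableModelChoicesGroq(models: GroqModel[]): string[] {
  return models.filter((m) => m.kind === "llm").map((m) => m.id);
}

const choices = getAvailableModelChoicesGroq([
  { id: "llama-3.1-8b-instant", kind: "llm" },
  { id: "nomic-embed-text", kind: "embedding" },
]);
// choices contains only the LLM id: ["llama-3.1-8b-instant"]
```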
```mermaid
sequenceDiagram
    participant User
    participant ModelManager
    participant Tool
    User->>ModelManager: Request model
    ModelManager->>ModelManager: Check modelMap
    ModelManager->>Tool: If tool model requested
    Tool-->>ModelManager: Provide tool-usage model
    ModelManager-->>User: Return model
```
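The lookup flow in the sequence diagram can be expressed as a small sketch. All names here (`resolveModel`, the map keys, the model identifiers) are illustrative assumptions, not code from the PR.

```typescript
// Illustrative resolver mirroring the diagram: the manager checks the
// map, and a tool request resolves to the tool-usage preview model.
type ModelRequest = "chat" | "tool";

const resolverMap: Record<ModelRequest, string> = {
  chat: "llama-3.1-70b-versatile",               // assumed identifier
  tool: "llama3-groq-70b-8192-tool-use-preview", // assumed identifier
};

function resolveModel(request: ModelRequest): string {
  const model = resolverMap[request];
  if (model === undefined) {
    throw new Error(`Unknown model request: ${request}`);
  }
  return model;
}
```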
In the meadow we hop and play,
New models bloom on this fine day!
"Llama3.1" now leads the way,
With tools to help us all sway.
So let's cheer for the code so bright,
Upgrading dreams in the soft moonlight!
Thanks @Jac-Zac we will finish this in https://github.com/run-llama/create-llama/pull/278
Adding support for the latest models from Meta that are available through the Groq API.
Summary by CodeRabbit
New Features
Bug Fixes