Open · Fried-Squid opened this issue 1 month ago
As I said in the PR, I'm happy to look at this myself, but I'm not sure how to proceed with implementing it on the frontend. I assume it would require changes to the JSON schema for blocks: rather than a single enum that the dropdown selects from, the Model dropdown would now depend on the Provider dropdown. If someone could advise on the best way to approach this, I'd be happy to try to get a PR out soon.
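For illustration only, here is a minimal sketch of how the dependency could be expressed in the block's JSON schema, using standard `oneOf`/`const` branches rather than one flat model enum. This is not the current block schema; all field names, provider names, and model lists below are hypothetical.

```typescript
// Hypothetical JSON Schema fragment for an LLM block's inputs.
// Instead of one flat enum of models, each branch fixes a provider
// (via "const") and lists only the models that provider can serve,
// so the frontend can drive the Model dropdown off the Provider value.
const llmInputSchemaSketch = {
  type: "object",
  properties: {
    provider: { type: "string", enum: ["ollama", "groq", "openai"] },
    model: { type: "string" },
  },
  oneOf: [
    {
      properties: {
        provider: { const: "ollama" },
        model: { enum: ["llama3.1", "mistral"] },
      },
    },
    {
      properties: {
        provider: { const: "groq" },
        model: { enum: ["llama-3.1-70b", "llama-3.1-8b"] },
      },
    },
    {
      properties: {
        provider: { const: "openai" },
        model: { enum: ["gpt-4o", "gpt-4o-mini"] },
      },
    },
  ],
} as const;
```

Under this shape, the frontend could read the `oneOf` branches to decide which model options to show for the selected provider, instead of relying on a single global enum.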
Duplicates
Summary 💡
The LLM call blocks should have separate dropdowns for provider and model, so that users can pick a specific provider when a model is available from more than one, for example choosing between Ollama and Groq for Llama models.
@ntindle @Bentlybro continued from convo in the PR
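To make the summary concrete, here is a minimal frontend-side sketch of the dependent dropdown, assuming a hypothetical provider-to-models map. The names and model lists are illustrative, not the actual AutoGPT frontend code; in practice the map would come from the block's schema or the backend.

```typescript
// Hypothetical map from provider to the models it can serve.
type ProviderName = "ollama" | "groq" | "openai";

const MODELS_BY_PROVIDER: Record<ProviderName, string[]> = {
  ollama: ["llama3.1", "mistral"],
  groq: ["llama-3.1-70b", "llama-3.1-8b"],
  openai: ["gpt-4o", "gpt-4o-mini"],
};

// Options for the Model dropdown, filtered by the selected provider;
// with no provider selected, fall back to every known model.
function modelOptions(provider?: ProviderName): string[] {
  return provider
    ? MODELS_BY_PROVIDER[provider]
    : Object.values(MODELS_BY_PROVIDER).flat();
}

// When the provider changes, keep the current model only if the new
// provider also offers it; otherwise clear it so the user re-selects.
function onProviderChange(
  newProvider: ProviderName,
  currentModel?: string,
): string | undefined {
  return currentModel && MODELS_BY_PROVIDER[newProvider].includes(currentModel)
    ? currentModel
    : undefined;
}
```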
Examples 🌈
No response
Motivation 🔦
This would allow more customization of LLM providers and models.