Open jlewi opened 1 week ago
How do we want to handle configuration for specifying the model to use? There are two different concepts.
We don't want to duplicate LLMProvider settings. We want to specify LLMProvider settings like BaseURL and APIKey once.
So one option would be to have an LLM provider section:
type ModelProvider struct {
	OpenAI    OpenAIConfig
	Anthropic AnthropicConfig
	Replicate ReplicateConfig
}
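The per-provider configs referenced above aren't spelled out in the issue; a minimal sketch of what one could hold, assuming BaseURL and APIKey are the shared fields:

```go
package main

import "fmt"

// Hypothetical contents of one per-provider config: connection
// settings are stated once here and shared by every model that
// uses this provider, so they never need to be repeated per model.
type OpenAIConfig struct {
	BaseURL string
	APIKey  string
}

func main() {
	cfg := OpenAIConfig{BaseURL: "https://api.openai.com/v1", APIKey: "sk-placeholder"}
	fmt.Println(cfg.BaseURL)
}
```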
We could then configure models by specifying the provider and its parameters:
type Model struct {
	OpenAIModel    OpenAIModelConfig
	AnthropicModel AnthropicModelConfig
}
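To make the split concrete, the per-model configs could carry only model selection and generation parameters, leaving credentials to the provider section; a sketch with hypothetical field names:

```go
package main

import "fmt"

// Hypothetical per-model config: names a model and generation
// parameters, but no BaseURL/APIKey -- those come from the
// matching entry in ModelProvider.
type OpenAIModelConfig struct {
	Model       string  // e.g. "gpt-4o"
	Temperature float64
}

type AnthropicModelConfig struct {
	Model string
}

// At most one of the fields is expected to be set; pointers make
// the unset providers explicit.
type Model struct {
	OpenAIModel    *OpenAIModelConfig
	AnthropicModel *AnthropicModelConfig
}

func main() {
	m := Model{OpenAIModel: &OpenAIModelConfig{Model: "gpt-4o", Temperature: 0.2}}
	fmt.Println(m.OpenAIModel.Model)
}
```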
Should it be a single Model type, or should we have different structures for different classes of models? E.g.:
type LLMConfig struct {
	OpenAILLM
	AnthropicLLM
}
type EmbeddingConfig struct {
	OpenAIEmbedding
	AnthropicEmbedding
}
Support using LLAMA3 on Replicate as the AI.