Open nedtwigg opened 8 months ago
I've been tinkering with Anthropic Haiku in their playground, and for a lot of our evals it's actually doing better than GPT-4-turbo. It's also 40x cheaper on input tokens, 20x cheaper on output, and subjectively far faster.
This is interesting indeed. I will take a look at it :)
In the meantime, you can reach Claude through OpenRouter's OpenAI-compatible endpoint with the following config:
```kotlin
val configOpenAI = OpenAIConfig(
    // API key and other params
    // ...
    host = OpenAIHost("https://openrouter.ai/api/v1/"),
)
// ...
val chatCompletionRequest = ChatCompletionRequest(
    // Other params and messages
    // ...
    model = ModelId("anthropic/claude-3-opus"),
    // or "anthropic/claude-3-sonnet" / "anthropic/claude-3-haiku"
)
// Perform the completion request...
```
Streaming output is supported.
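For reference, streaming through the same OpenRouter-backed config would look roughly like this. This is a sketch assuming the client's `Flow`-based `chatCompletions` API and illustrative field names (`delta`, `content`); treat the model id as an example, not a recommendation:

```kotlin
// Sketch: stream response deltas from an OpenRouter-hosted Claude model
// through the OpenAI-compatible client. Assumes a Flow-based
// `chatCompletions` API; not verified against OpenRouter here.
val request = ChatCompletionRequest(
    model = ModelId("anthropic/claude-3-haiku"),
    messages = listOf(ChatMessage(role = ChatRole.User, content = "Hi")),
)
openAI.chatCompletions(request).collect { chunk ->
    // Each chunk carries an incremental delta of the assistant's message.
    print(chunk.choices.firstOrNull()?.delta?.content.orEmpty())
}
```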
Feature Description
It would be neat if this library could talk to Anthropic Claude.
Problem it Solves
The new Claude 3 models are impressive, and their API is very similar to OpenAI's.
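To illustrate how close the two APIs are, here is a rough side-by-side of the two chat request bodies as plain Kotlin maps. Field names are taken from the public OpenAI and Anthropic HTTP docs; this is a comparison sketch, not library code:

```kotlin
// OpenAI-style chat completion request body (subset of fields).
val openAIRequest = mapOf(
    "model" to "gpt-4-turbo",
    "max_tokens" to 1024,
    "stream" to true,
    "messages" to listOf(mapOf("role" to "user", "content" to "Hello")),
)

// Anthropic Messages API request body (subset of fields).
val anthropicRequest = mapOf(
    "model" to "claude-3-haiku-20240307",
    "max_tokens" to 1024, // required by Anthropic, optional for OpenAI
    "stream" to true,
    "messages" to listOf(mapOf("role" to "user", "content" to "Hello")),
    // Main differences: the system prompt is a top-level "system" field
    // rather than a "system"-role message, and auth uses an x-api-key
    // header instead of a Bearer token.
)

val shared = openAIRequest.keys intersect anthropicRequest.keys
println(shared) // the request shapes overlap almost entirely
```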
Proposed Solution
Make it possible to chat with Anthropic Claude models.
Additional Context
I think it's great that this library is not langchain. I don't want a high-level abstraction; I just want to point this low-level abstraction at another LLM provider.