e2b-dev / fragments

Open-source Next.js template for building apps that are fully generated by AI. By E2B.
https://fragments.e2b.dev
Apache License 2.0

Custom Anthropic baseURL doesn't return the output result #54

Open pzc163 opened 2 months ago

pzc163 commented 2 months ago

In models.ts I changed the Anthropic and OpenAI baseURL as shown below:

```ts
import { createOpenAI } from '@ai-sdk/openai'

export function getModelClient(model: LLMModel, config: LLMModelConfig) {
  const { id: modelNameString, providerId } = model
  const { apiKey, baseURL } = config

  const providerConfigs = {
    anthropic: () =>
      createOpenAI({
        apiKey: apiKey || process.env.ANTHROPIC_API_KEY,
        baseURL: 'https://api.xhub.chat/v1',
      })(modelNameString),
    openai: () =>
      createOpenAI({
        apiKey: apiKey || process.env.OPENAI_API_KEY,
        baseURL: 'https://api.xhub.chat/v1',
      })(modelNameString),
    // ... rest of the file unchanged
```
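Worth noting: this routes Claude through `createOpenAI`, so the proxy has to expose an OpenAI-compatible chat endpoint for the Claude model IDs. If the proxy also offers an Anthropic-compatible endpoint, a sketch like the one below keeps Claude on the Anthropic provider instead (assumptions: `@ai-sdk/anthropic` is installed and `https://api.xhub.chat/v1` serves Anthropic's API shape; the key passed is whatever the proxy expects, not a real Anthropic key):

```ts
import { createAnthropic } from '@ai-sdk/anthropic'

// Sketch only: an Anthropic provider pointed at a custom baseURL.
// The key here is the proxy's key, so no real Anthropic key is needed.
const anthropic = createAnthropic({
  apiKey: process.env.ANTHROPIC_API_KEY, // assumed to hold the proxy key
  baseURL: 'https://api.xhub.chat/v1',   // assumed Anthropic-compatible endpoint
})

// Used the same way as in getModelClient: call the provider with the model id.
const model = anthropic('claude-3-5-sonnet-20240620')
```

In models.ts that would mean the `anthropic` entry calls `createAnthropic(...)` instead of `createOpenAI(...)`.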

The application runs and I can see the LLM response, but it never gives back the result of running the code and the preview doesn't show. The log shows:

```
model {
  id: 'claude-3-5-sonnet-20240620',
  provider: 'Anthropic',
  providerId: 'anthropic',
  name: 'Claude 3.5 Sonnet',
  multiModal: true
}
config { model: 'claude-3-5-sonnet-20240620' }
POST /api/chat 200 in 31541ms
```

I don't have a default Anthropic API key, so what can I do to solve this problem?

linear[bot] commented 2 months ago

E2B-661 custom Anthropic baseURL can't response output result

mishushakov commented 2 months ago

Have you set the E2B_API_KEY environment variable?
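A quick way to confirm is a temporary log in the chat route (hypothetical placement, e.g. near the top of the `/api/chat` handler), just to see whether the server process actually has the key loaded from `.env.local` or the environment:

```ts
// Temporary debug line (remove afterwards): prints whether the Next.js server
// process can see E2B_API_KEY at all.
console.log('E2B_API_KEY present:', Boolean(process.env.E2B_API_KEY))
```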

pzc163 commented 2 months ago

> Have you set the E2B_API_KEY environment variable?

Yes, E2B_API_KEY has already been set.
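If the key is definitely loaded, one way to rule out the sandbox side is a standalone check against E2B outside the app. A minimal sketch, assuming a recent `@e2b/code-interpreter` JS SDK (method names may differ between SDK versions):

```ts
import { Sandbox } from '@e2b/code-interpreter'

// Creates a sandbox using E2B_API_KEY from the environment, runs a trivial
// snippet, and prints its stdout. If this fails, the problem is on the E2B
// side rather than in the custom LLM baseURL.
async function main() {
  const sandbox = await Sandbox.create()
  try {
    const execution = await sandbox.runCode('print("hello from e2b")')
    console.log(execution.logs.stdout)
  } finally {
    await sandbox.kill()
  }
}

main().catch(console.error)
```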