Open elithrar opened 5 days ago
OK, looks like I missed configuring the `language_models.PROVIDER.available_models` setting to set up the custom model names, per https://zed.dev/docs/assistant/configuration#openai-custom-models

I'm now getting an error re: the `ResponseStreamResult` — and the Zed log doesn't give me any output to debug with, as far as I can see:
"language_models": {
"openai": {
"available_models": [
{
"display_name": "@cf/meta/llama-3.2-3b-instruct",
"name": "@cf/meta/llama-3.2-3b-instruct",
"max_tokens": 128000
},
{
"display_name": "@cf/meta/llama-3.1-70b-instruct",
"name": "@cf/meta/llama-3.1-70b-instruct",
"max_tokens": 128000
}
],
"version": "1",
"api_url": "https://api.cloudflare.com/client/v4/accounts/d458dbe698b8eef41837f941d73bc5b3/ai/v1"
}
}
Can you try your `api_url` without the `/v1`?
@notpeter Doesn't work (as expected): `/v1` is part of the route, and without `/v1` there's no server-side route.

Is there an easy way to log the response body that the `openai` extension is seeing, without rebuilding / running a dev build and adding log output? e.g. here: https://github.com/zed-industries/zed/blob/781fff220c4e282c973d48e4f4dae909a760411e/crates/open_ai/src/open_ai.rs#L363-L392
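For anyone following along, the debugging idea is simple: when a streamed line fails to deserialize into the expected struct, print the raw body rather than discarding it. A minimal self-contained sketch (this is not Zed's actual code; the `parse_event` function and the `data:` line format are assumptions for illustration):

```rust
// Hypothetical parser: expects SSE-style lines of the form `data: <payload>`.
// On failure it returns the raw line so the caller can log it.
fn parse_event(line: &str) -> Result<String, String> {
    line.strip_prefix("data: ")
        .map(|payload| payload.to_string())
        .ok_or_else(|| format!("unrecognized stream line: {line:?}"))
}

fn main() {
    for line in ["data: {\"choices\":[]}", "event: error"] {
        match parse_event(line) {
            Ok(payload) => println!("parsed payload: {payload}"),
            // The debugging hook: dbg!() prints file, line, and the value to
            // stderr, so the unexpected response body surfaces in the log.
            Err(raw) => {
                dbg!(&raw);
            }
        }
    }
}
```

The same pattern applies inside the real stream handler: wrap the deserialization failure arm with `dbg!()` or `log::debug!()` on the raw body before mapping it to an error.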
Apologies, I was thinking of our Anthropic API, which confusingly appends the `v1`: https://github.com/zed-industries/zed/blob/92c29be74cc2ac09dfe0d71d5a1048121b6ab4c6/crates/anthropic/src/anthropic.rs#L159

To confirm, is there anything notable in the Zed log (`~/Library/Logs/Zed/Zed.log`)?

My recommendation is to make a dev build and add some `dbg!()` statements. It's pretty easy to get a dev environment set up: a few pre-reqs, then `cargo run -- project_dir` and away you go. https://zed.dev/docs/development/macos
If you can't figure it out, in the coming days I'll try to standup a similar CF setup an see if I can reproduce or get things working.
Let me sprinkle some `debug!` macros throughout and see why the response isn't matching the struct.
Will report back.
Describe the bug / provide steps to reproduce it

Summary: Attempting to override the `default_model` does not apply when using the `openai` provider; it continues to attempt to set the model as `gpt-3.5-turbo`.

Repro:

1. Set a `default_model` for the OpenAI provider, along with a custom `api_url`.
2. Zed uses the `gpt-3.5-turbo` model instead of the `"@cf/meta/llama-3.2-3b-instruct"` model set in `assistant.default_model.model` in `settings.json`.

Note: I work at Cloudflare and thus was able to see the request our API infra accepted (and rejected due to the model mismatch).
I can see that the model should be passed per https://github.com/zed-industries/zed/blob/main/crates/open_ai/src/open_ai.rs#L159 and https://github.com/zed-industries/zed/blob/main/crates/open_ai/src/open_ai.rs#L306-L312, but can't see where the settings override is getting reset/ignored.
Relevant `settings.json`
Environment
Zed: v0.156.2 (Zed) OS: macOS 14.7.0 Memory: 32 GiB Architecture: aarch64
If applicable, add mockups / screenshots to help explain / present your vision of the feature
Assistant error:
Model selection resets/never includes the default model I set:
If applicable, attach your Zed.log file to this issue.
Zed.log