BoundaryML / baml

BAML is a language that helps you get structured data from LLMs, with the best DX possible. Works with all languages. Check out the promptfiddle.com playground
https://docs.boundaryml.com
Apache License 2.0

Environment Popup breaks the writing workflow in VSCode #1064

Closed hhimanshu closed 2 weeks ago

hhimanshu commented 3 weeks ago

Hello team,

BAML is fantastic! I discovered it this morning and already started adopting it for my own use case. So, great work on the DX!

One thing that breaks the flow: when I make changes to the file in VSCode while the playground is open (it was already open as part of the flow), the Environment window pops up as I type and pulls keyboard focus from the editor into the playground. It breaks the writing flow.

I am wondering if there is a way to hide the Environment window completely. Please let me know!

https://github.com/user-attachments/assets/cc72042c-3f61-4ab0-9413-28769152421e

hellovai commented 3 weeks ago

that looks incredibly annoying... we will patch that ASAP!

hhimanshu commented 3 weeks ago

Thank you @hellovai for the prompt response. I would be happy to test once the patch is released, let me know!

aaronvg commented 3 weeks ago

It will go out today -- if you want to work around this ASAP, you can just add any string to any of the empty environment variables, save it, and it will stop popping up.

aaronvg commented 2 weeks ago

This is resolved (after you close it the first time, it will not pop up again).

hhimanshu commented 2 weeks ago

@aaronvg, something else seems broken now:

Unspecified error code: 2
reqwest::Error { kind: Request, source: "JsValue(TypeError: Failed to fetch\nTypeError: Failed to fetch\n    at __wbg_fetch_1e4e8ed1f64c7e28 (https://file+.vscode-resource.vscode-cdn.net/Users/harit/.vscode-insiders/extensions/boundary.baml-extension-0.63.0/web-panel/dist/assets/baml_schema_build.js:2706:17)\n    at baml_schema_build.wasm.reqwest::wasm::client::js_fetch::h4c26ca3258bac32c (wasm://wasm/baml_schema_build.wasm-0262e19a:wasm-function[7125]:0x65e923)\n    at baml_schema_build.wasm.reqwest::wasm::client::fetch::{{closure}}::h7d4b0171e0e42a9e (wasm://wasm/baml_schema_build.wasm-0262e19a:wasm-function[368]:0x1c0edf)\n    at baml_schema_build.wasm.<baml_runtime::internal::llm_client::primitive::openai::openai_client::OpenAIClient as baml_runtime::internal::llm_client::traits::chat::WithStreamChat>::stream_chat::{{closure}}::h51ceb5149de66fa7 (wasm://wasm/baml_schema_build.wasm-0262e19a:wasm-function[390]:0x1d4ff1)\n    at baml_schema_build.wasm.baml_runtime::BamlRuntime::run_test::{{closure}}::h07afa129ae80d409 (wasm://wasm/baml_schema_build.wasm-0262e19a:wasm-function[139]:0x389fc)\n    at baml_schema_build.wasm.wasm_bindgen_futures::future_to_promise::{{closure}}::{{closure}}::h08756f3ed50c4b5f (wasm://wasm/baml_schema_build.wasm-0262e19a:wasm-function[633]:0x27d2f8)\n    at baml_schema_build.wasm.wasm_bindgen_futures::queue::Queue::new::{{closure}}::h2bd4eab1d7cf1be5 (wasm://wasm/baml_schema_build.wasm-0262e19a:wasm-function[3472]:0x53a6e8)\n    at baml_schema_build.wasm.<dyn core::ops::function::FnMut<(A,)>+Output = R as wasm_bindgen::closure::WasmClosure>::describe::invoke::h4787a4021637fe52 (wasm://wasm/baml_schema_build.wasm-0262e19a:wasm-function[13023]:0x6ef088)\n    at __wbg_adapter_41 (https://file+.vscode-resource.vscode-cdn.net/Users/harit/.vscode-insiders/extensions/boundary.baml-extension-0.63.0/web-panel/dist/assets/baml_schema_build.js:240:12)\n    at real (https://file+.vscode-resource.vscode-cdn.net/Users/harit/.vscode-insiders/extensions/boundary.baml-extension-0.63.0/web-panel/dist/assets/baml_schema_build.js:222:16))" }

Check the webview network tab for more details. Command Palette -> Open webview developer tools.

See this video

aaronvg commented 2 weeks ago

Thanks for the video; looking into it on the newest version.

aaronvg commented 2 weeks ago

can you try running ollama with:

OLLAMA_ORIGINS="*" ollama serve

I think something is wrong with the way we set up CORS on some new versions of the extension.
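For context, a minimal sketch of the kind of Ollama client definition the playground would be calling here, assuming the openai-generic provider and Ollama's default port (the actual BAML config isn't shown in this thread, so the names and values below are placeholders):

// Hypothetical client pointing the playground at a local Ollama server;
// the webview's fetch to this base_url is what fails when CORS is locked down.
client<llm> LocalOllama {
  provider "openai-generic"
  options {
    base_url "http://localhost:11434/v1"
    model "llama3.1"
  }
}

Note that OLLAMA_ORIGINS only applies to the ollama serve process it is set on, so it has to be set again (or configured persistently) after restarting or updating Ollama.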

hhimanshu commented 2 weeks ago

> can you try running ollama with:
>
> OLLAMA_ORIGINS="*" ollama serve
>
> I think something is wrong with the way we set up CORS on some new versions of the extension.

My bad, I completely missed this part. After updating Ollama, I forgot to set this again. Thanks for the catch!