nvms / wingman

Your pair programming wingman. Supports OpenAI, Anthropic, or any LLM on your local inference server.
https://marketplace.visualstudio.com/items?itemName=nvms.ai-wingman
ISC License
61 stars 10 forks

Model Installed in VS Code but No Settings No Logo in Sidebar Menu #35

Closed Worldboi closed 8 months ago

Worldboi commented 8 months ago

I have installed Wingman in Visual Studio Code Version: 1.85.0 on an M1 Pro MacBook running macOS Sonoma, but I can't find any settings for it, and its logo doesn't show up in the left sidebar menu like other extensions do. Also, when I open the Command Palette (Command-Shift-P) I don't see the large control screen like the one shown in your README page https://marketplace.visualstudio.com/items?itemName=nvms.ai-wingman. Am I missing something obvious? I closed and restarted VS Code to no avail.

nvms commented 8 months ago

You probably figured it out by now, but I think by default the Wingman panel is down below, next to the "Debug console", "Problems", "Output", etc.:

[Screenshot: Wingman tab in the bottom panel, next to "Debug console", "Problems", "Output"]

I think the default key to reveal this panel is CMD/CTRL-J, or you can go to View -> Debug console:

[Screenshot: View menu showing the Debug Console entry]
nvms commented 8 months ago

You can of course drag/drop the panel to the left sidebar if you want, which is where I like to keep it.

nvms commented 8 months ago

I'm assuming this was just a case of "where is it?", instead of something actually not working, so I'm closing this. Please feel free to open this back up if I'm wrong.

Worldboi commented 8 months ago

Thanks for the hint about View > Debug Console, but I found it via Command-Shift-P and searching for Wingman. However, when I look for it in Settings -> Extensions it's not there. So there are no settings, just the "Presets" tab, is that right? Also, I didn't know about the drag-and-drop-to-the-sidebar trick. Thanks. My other extensions don't seem to behave this way.

nvms commented 8 months ago

> So there are no settings, just the "Presets" tab, is that right?

Yep. The majority of configuration happens in the "Presets" tab, but the other tabs are used for some light configuration as well: prompts, placeholders, and modes.

> However, when I look for it in Settings -> Extensions it's not there.

Correct, there are no settings under Settings -> Extensions. Everything that is configurable is configured from Wingman's own UI.

Worldboi commented 8 months ago

Thanks for your reply. Now to connect to LM Studio, which is running in server mode: the default URL for the endpoint says http://localhost:1234/v1/chat/completions, but my other project using Open Interpreter uses "http://localhost:1234/v1". Should I change the URL by deleting the /chat/completions part? At the moment my prompt messages aren't received by LM Studio.

nvms commented 8 months ago

http://localhost:1234/v1/chat/completions works for me. I would suggest using that endpoint with Wingman. It mimics the OpenAI /chat/completions API, which is what the OpenAI provider in Wingman is expecting to use.

I'm running LM Studio 0.2.8.

http://localhost:1234/v1 does not work for me. Instead, I see this error in their console: `[2023-12-10 21:14:57.283] [ERROR] Unexpected endpoint or method. (POST /v1). Returning 200 anyway.`
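For anyone verifying their LM Studio setup independently of Wingman, here is a minimal sketch of the kind of OpenAI-style request that endpoint expects (the model name and temperature below are placeholders, not values Wingman actually sends):

```python
import json
from urllib import request, error

# Endpoint from this thread: LM Studio's OpenAI-compatible server on its default port.
URL = "http://localhost:1234/v1/chat/completions"

# Placeholder payload in the OpenAI chat-completions shape.
payload = {
    "model": "local-model",  # LM Studio serves whatever model is loaded
    "messages": [{"role": "user", "content": "Say hello"}],
    "temperature": 0.7,
}

req = request.Request(
    URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

try:
    with request.urlopen(req, timeout=10) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
except error.URLError as exc:
    # Nothing listening on :1234 (server mode not started, or wrong port).
    print(f"Could not reach LM Studio: {exc}")
```

If this prints a completion, the server side is fine and any remaining problem is in the extension's provider configuration; posting the same payload to http://localhost:1234/v1 instead should reproduce the "Unexpected endpoint or method" error above.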

Worldboi commented 8 months ago

Thanks for your help.