nvms / wingman

Your pair programming wingman. Supports OpenAI, Anthropic, or any LLM on your local inference server.
https://marketplace.visualstudio.com/items?itemName=nvms.ai-wingman
ISC License

FEAT: Anthropic Claude support #1

Closed: capdevc closed this issue 1 year ago

capdevc commented 1 year ago

Any interest in supporting Anthropic's Claude model? The 100k token context window opens a lot of possibilities.

If so I might take a stab at it and send a PR.

Also, love how configurable your extension is.

nvms commented 1 year ago

Yes! I absolutely plan to support Claude, just not quite there yet. If you get to it before I do, that's totally cool!

nvms commented 1 year ago

I stubbed out support for Claude using the new Provider API, and the OpenAIProvider can serve as an example for how to implement a new provider (if you are interested):

I still don't have access to Claude, so apart from the above I haven't really been able to dig into this yet.

capdevc commented 1 year ago

@nvms I've started taking an initial look at this. Full disclosure: I have exactly zero experience with JS/TS or vscode extensions so I'm kind of learning it and the relevant tooling on the go.

The main thing blocking me is that the Anthropic/Claude API has no counterpart to OpenAI's conversation-tracking stuff. It's much simpler and more limited: each request stands on its own and is basically a single blob of text with alternating `User: ...` and `Assistant: ...` blocks. So for a conversation I'd need to build each new prompt manually, keeping the current state somewhere and appending new questions and responses at each step.
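For illustration, a two-turn exchange would mean sending one blob shaped roughly like this (my reading of their docs, so take the exact labels with a grain of salt):

```
User: How do I read a file in Node?

Assistant: Use fs.readFileSync, for example...

User: And asynchronously?

Assistant:
```

with a trailing `Assistant:` marker so the model fills in the next response.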

It's not clear to me where this state should live, or when/how to clear it.

nvms commented 1 year ago

> no counterpart to the OpenAI conversation tracking stuff

ah, I wasn't aware of this. I think your method of preserving state is exactly what you'll want to do, then.

maybe something like this?:
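A rough, completely untested sketch. `ClaudeConversation` and all the method names are placeholders, not actual wingman code, and the turn labels may need to change to match whatever Anthropic actually expects:

```typescript
// Placeholder sketch for keeping conversation state between requests.
// Claude's completion API takes one flat prompt string per request, so we
// rebuild the whole blob from stored turns each time.

type Role = "User" | "Assistant";

interface Turn {
  role: Role;
  text: string;
}

class ClaudeConversation {
  private turns: Turn[] = [];

  addUserMessage(text: string): void {
    this.turns.push({ role: "User", text });
  }

  // Call this with each completion that comes back, so the next prompt
  // includes it.
  addAssistantMessage(text: string): void {
    this.turns.push({ role: "Assistant", text });
  }

  // Flatten the stored turns into a single prompt blob, ending with a bare
  // "Assistant:" marker so the model knows to produce the next turn.
  // NOTE: adjust the labels to whatever Claude expects -- Anthropic's docs
  // show "\n\nHuman:" / "\n\nAssistant:" markers.
  buildPrompt(): string {
    const body = this.turns.map((t) => `\n\n${t.role}: ${t.text}`).join("");
    return `${body}\n\nAssistant:`;
  }

  // Call when the user starts a fresh conversation.
  clear(): void {
    this.turns = [];
  }
}
```

the state could just live on the provider instance, and get cleared whenever the user resets the chat or changes the preset.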