RussellCanfield / wingman-ai

An open source AI coding assistant VSCode extension. Works with Ollama, HuggingFace, OpenAI and Anthropic
MIT License

Add ability to exclude context #42

Closed — tadq closed this 6 days ago

tadq commented 4 months ago

By default, Wingman AI uses the current file as context for requests to the LLM.

It would be nice to send a general request/question without any context, because:

  1. Context slows down the LLM's response for general questions.
  2. If you do not want any context, you are forced to select blank line(s) in the text editor.

harlenalvarez commented 4 months ago

Have you tried changing the chat context length to 0 in the settings?

tadq commented 4 months ago

I only want to do occasional no-context queries to the LLM, not all queries.

harlenalvarez commented 4 months ago

How would you recommend exposing this option? A toggle in the chat window or did you have something else in mind?

tadq commented 4 months ago

A toggle is a great idea: just a switch for whether to provide source code context, yes/no.

harlenalvarez commented 4 months ago

I'm working on a small POC; as soon as it's done I'll add it.
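The toggle discussed above could be sketched roughly as follows. This is a minimal illustration, not Wingman AI's actual code: the names `includeContext`, `ChatRequest`, and `buildRequest` are hypothetical, standing in for however the extension assembles the prompt it sends to the LLM.

```typescript
// Hypothetical sketch of a context toggle for a chat request.
// None of these names come from the Wingman AI codebase.
interface ChatRequest {
  question: string;
  context?: string; // source code context; omitted when the toggle is off
}

// Toggle state, e.g. bound to a checkbox in the chat webview.
let includeContext = true;

function buildRequest(question: string, activeFileText: string): ChatRequest {
  // When the toggle is off, send only the bare question: faster responses
  // and no need to select blank lines to suppress context.
  if (!includeContext) {
    return { question };
  }
  return { question, context: activeFileText };
}
```

With `includeContext` set to `false`, the request carries no `context` field at all, so the LLM sees only the user's question.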