Closed tadq closed 6 days ago
Have you tried changing the chat context length to 0 in the settings?
I only want to do occasional no-context queries to LLMs, not all queries.
How would you recommend exposing this option? A toggle in the chat window or did you have something else in mind?
A toggle is a great idea. Just a toggle to provide source code context: yes/no.
I'm working on a small POC; as soon as I'm done I'll add it.
By default, Wingman AI uses the current file as context for the request to the LLM.
It would be nice to be able to send a general request/question without any context, because: