# nvms / wingman

Your pair programming wingman. Supports OpenAI, Anthropic, or any LLM on your local inference server.

Marketplace: https://marketplace.visualstudio.com/items?itemName=nvms.ai-wingman
License: ISC · 64 stars · 10 forks
## Issues

| # | Title | Author | Status | Last activity | Comments |
|---|-------|--------|--------|---------------|----------|
| #40 | Remove restriction on Ollama models | tadq | closed | 7 months ago | 0 |
| #39 | Add Claudev3 model support | fredxfred | open | 8 months ago | 0 |
| #38 | Feature Request: Support Anthropic messages API | onlurking | open | 9 months ago | 0 |
| #37 | New {{cursor}} placeholder for prompts | Entaigner | open | 10 months ago | 1 |
| #36 | `Error: Failed to open completion stream: 429 Too Many Requests` | akashagarwal7 | closed | 11 months ago | 2 |
| #35 | Model Installed in VS Code but No Settings No Logo in Sidebar Menu | Worldboi | closed | 11 months ago | 8 |
| #34 | The selected code is forcibly replaced. | czkoko | closed | 11 months ago | 4 |
| #33 | Feature Request: Integration of Azure OpenAI Credentials Support | umutcanoner | open | 12 months ago | 1 |
| #32 | Feature Request: Enhance Contextual Understanding by Allowing Code File Integration | umutcanoner | open | 12 months ago | 0 |
| #31 | Missing Implementation of 'file' Placeholder as Mentioned in README | umutcanoner | closed | 12 months ago | 1 |
| #30 | Discussion: modelfusion | capdevc | open | 12 months ago | 2 |
| #29 | Using the new version with local open source backends | synw | open | 12 months ago | 5 |
| #28 | Add .DS_Store and *.code-workspace to .gitignore | capdevc | closed | 12 months ago | 0 |
| #27 | Wingman v2.0.8 , Local model is not supporting | NK-Aero | closed | 12 months ago | 4 |
| #26 | Please include the julia language (.jl) extension also. | MrBenzWorld | closed | 12 months ago | 1 |
| #25 | A new theme sharing | czkoko | closed | 12 months ago | 1 |
| #24 | Great Job! | ChadDa3mon | closed | 12 months ago | 1 |
| #23 | Version 2 | nvms | closed | 12 months ago | 0 |
| #22 | Add new providers: Koboldcpp and Goinfer | synw | closed | 1 year ago | 0 |
| #21 | New Koboldcpp provider | synw | open | 1 year ago | 10 |
| #20 | New Goinfer provider | synw | closed | 12 months ago | 3 |
| #19 | Write function comment prompt: ask before injecting the response into the file | synw | closed | 12 months ago | 1 |
| #18 | Disable the copy icon when generating | synw | closed | 12 months ago | 1 |
| #17 | Context window and max_tokens management | synw | open | 1 year ago | 0 |
| #16 | Support configurable inference params per prompt | synw | closed | 1 year ago | 8 |
| #15 | The spinning icon is still showed when a request is canceled | synw | closed | 1 year ago | 3 |
| #14 | Add configuration for openai response type: stream or buffer | nvms | closed | 12 months ago | 0 |
| #13 | Template format for local models | synw | open | 1 year ago | 6 |
| #12 | How to debug the api responses for local model usage? | synw | open | 1 year ago | 4 |
| #11 | Configurable timeout | synw | closed | 1 year ago | 2 |
| #10 | Custom Templates can not use call backs | GarrettEHill | closed | 1 year ago | 1 |
| #9 | Add support for Visual Studio Code for the Web | GarrettEHill | open | 1 year ago | 0 |
| #8 | Restructure prompt categories to allow for different use context cases. | GarrettEHill | closed | 12 months ago | 1 |
| #7 | Add support for llama2 | GarrettEHill | closed | 12 months ago | 1 |
| #6 | Allow users to define their own "provider" by decoupling the API style from predefined `Provider` types | nvms | closed | 12 months ago | 0 |
| #5 | Use SecretStorage | nvms | closed | 1 year ago | 4 |
| #4 | FEAT: API keys in VS Code secrets API | capdevc | closed | 1 year ago | 1 |
| #3 | Anthropic Claude support | capdevc | closed | 1 year ago | 4 |
| #2 | FEAT: LSP Provided Context | capdevc | open | 1 year ago | 6 |
| #1 | FEAT: Anthropic Claude support | capdevc | closed | 1 year ago | 4 |