huggingface / llm-vscode
LLM powered development for VSCode
Apache License 2.0 · 1.23k stars · 133 forks
Issues (newest first)
#146 · Issue with Self-Hosted LLM Integration Using vLLM · AwayahShahid · opened 1 month ago · 1 comment
#145 · There will be two inference responses after users stopping editing. · 10901008-RoryHuang · opened 1 month ago · 1 comment
#144 · View requests and responses · mina58 · opened 4 months ago · 7 comments
#143 · What capabilities does this extension add on top of other OSS extensions? · DanielAdari · opened 5 months ago · 1 comment
#142 · feat: set `starcoder2-15b` as default · McPatate · closed 6 months ago · 0 comments
#141 · feat: update llm-ls to `0.5.3` · McPatate · closed 6 months ago · 0 comments
#140 · Running llama-cpp-python OpenAI compatible server · abasu0713 · opened 7 months ago · 8 comments
#139 · Create context for inline suggestions · FrenchBen · closed 4 months ago · 2 comments
#138 · BUG: LLM-vscode breaks core VSCode keybindings · FrenchBen · closed 4 months ago · 4 comments
#137 · Inference api error: Service Unavailable · vanschroeder · closed 5 months ago · 9 comments
#136 · [Feat. Req] Add toggle to enable/disable completion · knoopx · opened 7 months ago · 1 comment
#135 · Add FIM config for CodeGemma · postmasters · opened 7 months ago · 0 comments
#134 · Is it still intended and possible to use custom endpoint in 0.2.0? · Kaschi14 · closed 4 months ago · 9 comments
#133 · Add LlamaCpp support · FredericoPerimLopes · opened 9 months ago · 0 comments
#132 · Support for Llamacpp · FredericoPerimLopes · closed 6 months ago · 6 comments
#131 · OpenAI backend still creates HuggingFace-formatted request · rggs · closed 7 months ago · 3 comments
#130 · fix(ci): remove i686-windows bin · McPatate · closed 9 months ago · 0 comments
#129 · feat: `0.5.x` · McPatate · closed 9 months ago · 1 comment
#128 · [BUG] Completion doesn't override parentheses · DanielAdari · opened 9 months ago · 8 comments
#127 · Add support for properly interpreting `context.selectedCompletionInfo` · spew · opened 10 months ago · 2 comments
#126 · Loading indicator in status bar · HennerM · closed 9 months ago · 0 comments
#125 · Fix temperature config type · HennerM · closed 9 months ago · 1 comment
#124 · Fix camelCase API compatibility · HennerM · closed 9 months ago · 1 comment
#123 · Add template for DeepSeek coder · HennerM · closed 9 months ago · 0 comments
#122 · Fix/changed params to camel case · florian-guily · closed 9 months ago · 2 comments
#121 · Latest version is crashing when using autocomplete · darolt · closed 8 months ago · 3 comments
#120 · [BUG] Continuous file change events · pip25 · closed 10 months ago · 3 comments
#119 · [BUG] Crash on new untitled VS Code file · DataOps7 · closed 8 months ago · 3 comments
#118 · No New Prevligies FLAG · enterprisium · closed 9 months ago · 1 comment
#117 · feat: Add adaptors for ollama and openai · noahbald · closed 9 months ago · 2 comments
#116 · LLM VS Code client: couldn't create connection to server | llm-ls failed · raj2125 · closed 11 months ago · 6 comments
#115 · Client LLM VS Code: connection to server is erroring. Shutting down server. · IngoTB303 · closed 9 months ago · 4 comments
#114 · Sending custom header with each API request · Simon-Stone · closed 9 months ago · 7 comments
#113 · Weird behavior with "codellama/CodeLlama-13b-hf" · icnahom · opened 11 months ago · 7 comments
#112 · fix: allow auto-completion on the first line · antoinejeannot · closed 11 months ago · 0 comments
#111 · feat: add llm.requestDelay setting v2 · antoinejeannot · closed 11 months ago · 2 comments
#110 · Extension causes code-oss to crash · frost19k · closed 8 months ago · 3 comments
#109 · Cannot destructure property 'request_id' of 'response' as it is undefined · thomasa88 · closed 12 months ago · 3 comments
#108 · Trying to connect to local text-generation-webgui server · Falenos · closed 9 months ago · 2 comments
#107 · Extension panicking · remyleone · closed 1 year ago · 2 comments
#106 · [Feature request] Adding a checker to see if a custom endpoint is working properly · remyleone · opened 1 year ago · 1 comment
#105 · Empty response with custom api · thanhnew2001 · opened 1 year ago · 3 comments
#104 · Give too many <MID> <PRE> <SUF> inline response when load custom LLM model with llm-vscode-server · bonuschild · opened 1 year ago · 4 comments
#103 · feat: add `llm.requestDelay` setting · davidpissarra · closed 11 months ago · 5 comments
#102 · Which version of `llm-ls` is integrated? won't work on linux with glibc version dismatch · bonuschild · closed 12 months ago · 1 comment
#101 · raise OS ERROR 123 on VSCode(windows) · bonuschild · closed 1 year ago · 5 comments
#100 · How to generate the response from locally hosted end point in vscode? · dkaus1 · opened 1 year ago · 1 comment
#99 · Error decoding response body: expected value at line 1 column 1 · jalalirs · opened 1 year ago · 6 comments
#98 · style: remove a duplicate of in · bufferoverflow · closed 9 months ago · 0 comments
#97 · Client not running on Remote VSCode · NicolasAG · opened 1 year ago · 8 comments