huggingface / llm-ls
LSP server leveraging LLMs for code completion (and more?)
Apache License 2.0 · 540 stars · 41 forks
Issues (sorted by: Newest)
#100 · issue 45 fix - use SystemTime instead of instant for unauthenticated warning · Ovich · closed 2 days ago · 1 comment
#99 · Can not using the ollama in docker container. ERROR: [LLM] http error · meicale · opened 3 weeks ago · 1 comment
#98 · Use as backend for chat-style UI · raine · opened 1 month ago · 1 comment
#97 · Can't process response from llamacpp server · gergap · opened 1 month ago · 2 comments
#96 · fix(ci): update actions to use nodejs 20 · McPatate · closed 1 month ago · 1 comment
#95 · feat: add backend url path completion · McPatate · closed 1 month ago · 1 comment
#94 · feat: add `llama.cpp` backend · McPatate · closed 1 month ago · 1 comment
#93 · [LLM] missing field `request_params` · Terr2048 · opened 2 months ago · 1 comment
#92 · Deepseek Coder not working · rhusiev · opened 3 months ago · 0 comments
#91 · Handle multiple initialized workspaces in get Completions · Wats0ns · opened 3 months ago · 0 comments
#90 · Respect XDG environment variables · life00 · opened 3 months ago · 1 comment
#89 · Proposal: Launching LLM server as a daemon · blmarket · opened 4 months ago · 2 comments
#88 · Feature/multiple encodings handled · jeremyelalouf · closed 4 months ago · 8 comments
#87 · Can't accept completions · Freaksed · closed 1 month ago · 2 comments
#86 · Added batch embedding computing · Wats0ns · closed 4 months ago · 0 comments
#85 · test: fix invalid deserialization · McPatate · closed 4 months ago · 1 comment
#84 · fix: namelss file crash · McPatate · closed 4 months ago · 0 comments
#83 · fix: `AcceptCompletionParams` -> `RejectCompletionParams` · McPatate · closed 4 months ago · 0 comments
#82 · refactor: cleanup unused code · McPatate · closed 4 months ago · 0 comments
#81 · Add Llamacpp support · FredericoPerimLopes · closed 1 month ago · 5 comments
#80 · When the backend is 'tgi', `build_url(...)` should append `/generate` to the URL · spew · closed 1 month ago · 0 comments
#79 · feat: Add Llamacpp support · FredericoPerimLopes · closed 4 months ago · 0 comments
#78 · fix: always set `return_full_text` to false for better UX · McPatate · closed 4 months ago · 0 comments
#77 · Use rustls-tls-native-roots to allow for OS cert stores for rustls · FileMagic · opened 4 months ago · 6 comments
#76 · codellama unusable with llm-ls 0.5.1 · williamspatrick · closed 4 months ago · 3 comments
#75 · fix: deserialize `url` null value w/ default if `backend: huggingface` · McPatate · closed 4 months ago · 0 comments
#74 · feat: update backend & model parameter · McPatate · closed 5 months ago · 0 comments
#73 · fix: helix editor build crash · McPatate · closed 5 months ago · 0 comments
#72 · feat: add socket connection · McPatate · closed 5 months ago · 0 comments
#71 · refactor: error handling · McPatate · closed 5 months ago · 0 comments
#70 · refactor: adaptor -> backend · McPatate · closed 5 months ago · 0 comments
#69 · refactor: adaptor list should be an enum · McPatate · closed 5 months ago · 8 comments
#68 · Unauthenticated warning should not show when a custom backend is used · spew · closed 5 months ago · 2 comments
#67 · use APIParams for requests to TGI API · johan12345 · closed 5 months ago · 0 comments
#66 · Add support for properly interpreting `context.selectedCompletionInfo` · spew · closed 5 months ago · 3 comments
#65 · Fix TGI generation parameters · HennerM · closed 5 months ago · 0 comments
#64 · Fix off-by-1 in prompt creation · HennerM · closed 5 months ago · 0 comments
#63 · Completions not displaying in some cases · Wats0ns · opened 5 months ago · 1 comment
#62 · Account for FIM tokens in prompt · HennerM · closed 5 months ago · 1 comment
#61 · Fix off-by-1 error when removing from end of document · HennerM · closed 5 months ago · 0 comments
#60 · Fix handling of end-of-file · HennerM · closed 5 months ago · 0 comments
#59 · Ignore documents with "output" scheme · HennerM · closed 5 months ago · 0 comments
#58 · Only warn of rate-limits when using HF endpoint · HennerM · closed 5 months ago · 1 comment
#56 · [Suggestion] Metrics support · DanielAdari · opened 6 months ago · 3 comments
#55 · emacs support? · NightMachinery · opened 6 months ago · 1 comment
#54 · Cannot build testbed on Windows · noahbald · opened 6 months ago · 2 comments
#53 · Help: Starting · aemonge · closed 6 months ago · 3 comments
#52 · add support for Kotlin language · johan12345 · closed 6 months ago · 0 comments
#51 · feat: multi file context · McPatate · opened 7 months ago · 0 comments
#50 · Disable ropey `unicode_lines` feature · rojas-diego · closed 6 months ago · 8 comments