huggingface / llm.nvim
LLM powered development for Neovim
Apache License 2.0 · 741 stars · 46 forks
Issues (newest first)
#110 · Attempt to call field 'nvim_create_user_command' (a nil value) · shoang22 · opened 1 week ago · 0 comments
#109 · Where should I put the config file? · jm33-m0 · opened 1 month ago · 2 comments
#108 · How to use proxy env var · SethARhodes · opened 1 month ago · 0 comments
#107 · Inconsistent virtual text placement with tabs · Nimrod0901 · opened 1 month ago · 1 comment
#106 · How to use the OpenAI API? · 4t8dd · closed 1 month ago · 0 comments
#104 · Chatbot with TUI · metal3d · opened 2 months ago · 0 comments
#103 · Add system prompt for FIM or other parameters · meicale · opened 2 months ago · 0 comments
#102 · fix(keymaps): Prevent key press cancellation when no completions available · hugovntr · opened 2 months ago · 0 comments
#100 · llm.nvim does not attach to the buffer · rhusiev · opened 3 months ago · 5 comments
#99 · Cannot use Ollama in a Docker container: ERROR: [LLM] http error · meicale · opened 3 months ago · 1 comment
#98 · ci(#96): added tag-PR and commitlint workflows · AlejandroSuero · closed 3 months ago · 1 comment
#97 · feat(#96): improve DX · AlejandroSuero · closed 3 months ago · 0 comments
#96 · [Feat]: Improve DX · AlejandroSuero · opened 3 months ago · 3 comments
#95 · feat(ci): add trufflehog secrets detection · McPatate · closed 3 months ago · 0 comments
#94 · How do you use this? · s1nistr4 · opened 3 months ago · 0 comments
#93 · Ollama not working · nfwyst · opened 3 months ago · 1 comment
#92 · feat(config): add default request body per backend · Robzz · closed 3 months ago · 0 comments
#91 · Use vim.lsp.start_client instead of vim.lsp.start · blmarket · closed 3 months ago · 2 comments
#90 · feat: change default model to `starcoder2-15b` · McPatate · closed 4 months ago · 0 comments
#89 · feat: bump llm-ls to `0.5.3` · McPatate · closed 4 months ago · 0 comments
#88 · Neovim 0.10.0 support · roman3pm · closed 3 months ago · 12 comments
#87 · Merge of user config with default config causes issues · Robzz · closed 3 months ago · 3 comments
#86 · Error starting llm-ls · yduanBioinfo · closed 3 months ago · 4 comments
#85 · No LSP with LLM · Freaksed · opened 5 months ago · 0 comments
#84 · Add FIM configuration template for CodeGemma · postmasters · opened 5 months ago · 0 comments
#83 · Extremely slow on completion · gzfrozen · closed 4 months ago · 2 comments
#82 · Allow environment variables to be passed · blmarket · closed 3 months ago · 2 comments
#81 · No auto-completions and weird offset_encodings warning · mAmineChniti · opened 6 months ago · 5 comments
#79 · Can't get it to work with Ollama · Bios-Marcel · closed 6 months ago · 7 comments
#77 · Can't use completions · Freaksed · closed 5 months ago · 2 comments
#76 · Rate limit error on locally deployed model · V4G4X · opened 7 months ago · 2 comments
#75 · Default keymap disables jumping forward with <C-I> in the jumplist · Trulsaa · opened 7 months ago · 0 comments
#74 · fix: dangling llm-ls process · McPatate · closed 7 months ago · 0 comments
#73 · README suggests Ollama should work but it does not · Amzd · closed 7 months ago · 1 comment
#72 · feat: `0.5.0` · McPatate · closed 7 months ago · 0 comments
#71 · Unreachable LLM server blocks UI · RemcoSchrijver · opened 7 months ago · 1 comment
#70 · Ask what to map · aemonge · opened 9 months ago · 0 comments
#69 · Large model support: too large to be loaded automatically (67GB > 10GB) · aemonge · closed 9 months ago · 3 comments
#68 · feat: add adaptors to config and completion request · noahbald · closed 7 months ago · 1 comment
#67 · Pass cmd_env to lsp · devvit · closed 7 months ago · 0 comments
#66 · Expose callbacks · teto · opened 10 months ago · 4 comments
#65 · LSP server not started properly · teto · opened 11 months ago · 2 comments
#64 · Check for llm-ls in PATH? · teto · opened 11 months ago · 5 comments
#63 · feat: add accept & reject completion calls · McPatate · closed 11 months ago · 0 comments
#62 · feat: bump llm-ls to `0.3.0` · McPatate · closed 11 months ago · 0 comments
#61 · `Tab` key not usable in insert mode · bogdan-the-great · opened 11 months ago · 7 comments
#60 · feat: update `llm-ls` to `0.2.2` · McPatate · closed 11 months ago · 0 comments
#59 · `[LLM] Model bigcode/starcoderbase is currently loading` · bogdan-the-great · closed 11 months ago · 4 comments
#58 · feat: add `enable_suggestions_on_files` setting · McPatate · closed 12 months ago · 0 comments
#57 · docs: add PRO plan note · McPatate · closed 12 months ago · 0 comments