gsuuon / model.nvim

Neovim plugin for interacting with LLMs and building editor-integrated prompts.

MIT License · 291 stars · 21 forks
Issues
| # | Title | Author | State | Last activity | Comments |
|---|-------|--------|-------|---------------|----------|
| #59 | `:telescope model mchat` | gsuuon | closed | 1 month ago | 0 |
| #58 | `:Mcancel` does not work for chats | GordianDziwis | closed | 2 months ago | 2 |
| #57 | stop adding separator when completing a partial assistant response | psmitsu | closed | 1 month ago | 3 |
| #56 | Treesitter mchat | gsuuon | closed | 2 months ago | 0 |
| #55 | [feature request] - Integration with TGI | CrossNox | open | 3 months ago | 1 |
| #54 | Use markdown for mchat filetype | GordianDziwis | closed | 2 months ago | 4 |
| #53 | Add ability to specify ollama url in provider options | gordon-quad | closed | 3 months ago | 0 |
| #52 | `extract_markdown` transform fails on vim.notify | gordon-quad | closed | 4 months ago | 1 |
| #51 | Curl to llama.cpp server fails to parse json if accessed over the network | gordon-quad | closed | 4 months ago | 10 |
| #50 | Cleanup deprecated and lint | gsuuon | closed | 5 months ago | 0 |
| #49 | feat: manage context commands | gsuuon | closed | 5 months ago | 0 |
| #48 | fix: notify when in lua loop callback | somnam | closed | 5 months ago | 0 |
| #47 | imp: linting | randoentity | open | 5 months ago | 3 |
| #45 | fix: carriage return in response | randoentity | closed | 6 months ago | 0 |
| #44 | Proposal: Use tree-sitter for highlighting in chat sessions | murtaza64 | closed | 2 months ago | 4 |
| #43 | switch default openai to gpt4 | blankenshipz | closed | 6 months ago | 1 |
| #42 | fix: vim.fs.joinpath not in neovim 0.9.4 | TyberiusPrime | closed | 6 months ago | 3 |
| #41 | add langserve provider | wesl-ee | closed | 6 months ago | 1 |
| #40 | add example Mistral AI "La plateforme" configuration (using OpenAI provider) | tgy | closed | 6 months ago | 1 |
| #39 | `llm.nvim` -> `model.nvim` | gsuuon | closed | 6 months ago | 0 |
| #36 | Integration with github copilot | Andrej-Marsic | open | 7 months ago | 3 |
| #35 | Notify on empty response | gsuuon | open | 8 months ago | 0 |
| #34 | Failing to connect/start the llama.cpp server | mutlusun | closed | 7 months ago | 4 |
| #33 | Openai proxy | gsuuon | closed | 8 months ago | 0 |
| #32 | Fix sse parsing | gsuuon | closed | 8 months ago | 0 |
| #31 | [Feature request] proxy support for curl? | kohane27 | closed | 8 months ago | 7 |
| #30 | Llamacpp autostart | gsuuon | closed | 8 months ago | 0 |
| #29 | how to add codeium? | WillEhrendreich | closed | 8 months ago | 5 |
| #28 | Improve README | gsuuon | open | 8 months ago | 1 |
| #27 | Setting up llamacpp | rhsimplex | closed | 8 months ago | 6 |
| #26 | Tests | gsuuon | closed | 8 months ago | 0 |
| #22 | Autostart the llama.cpp server example | gsuuon | closed | 8 months ago | 0 |
| #21 | Undojoin segment edits | gsuuon | closed | 9 months ago | 0 |
| #20 | Buffer mode options | Andrej-Marsic | closed | 9 months ago | 1 |
| #19 | [PaLM]: filters.reason = "OTHER" | orhnk | closed | 9 months ago | 1 |
| #18 | Add option to insert streamed responses using :undojoin | MattSPalmer | closed | 9 months ago | 3 |
| #17 | llamacpp provider change to use local server instead of local binary breaks existing prompts using llamacpp | helmling | closed | 9 months ago | 4 |
| #16 | Codellama FIM | gsuuon | closed | 9 months ago | 0 |
| #14 | Support for connection to llamacpp server | JoseConseco | closed | 9 months ago | 3 |
| #13 | llama.cpp usage | Vesyrak | closed | 9 months ago | 14 |
| #12 | Error executing luv callback: Error in system on_stdout handler | judaew | closed | 9 months ago | 3 |
| #10 | `Question` : Change Default provider? | NormTurtle | closed | 9 months ago | 0 |
| #7 | Remove vim system | gsuuon | closed | 10 months ago | 0 |
| #6 | LLM error ['stop'] | gdnaesver | closed | 10 months ago | 1 |
| #5 | llama.cpp without neovim nightly build? (0.10) | slenderq | closed | 10 months ago | 4 |
| #4 | Remove ns_id from extmark details | ibash | closed | 1 year ago | 3 |
| #3 | Prompt handlers | gsuuon | closed | 1 year ago | 0 |
| #2 | Query segments | gsuuon | closed | 1 year ago | 0 |
| #1 | Prompt alternatives | gsuuon | closed | 1 year ago | 0 |