David-Kunz / gen.nvim
Neovim plugin to generate text using LLMs with customizable prompts
License: The Unlicense · 977 stars · 62 forks
Issues (newest first)
#51 Using Plugged as package manager? (zevv, closed 7 months ago, 2 comments)
#50 replace = true not replacing (Amzd, opened 7 months ago, 8 comments)
#49 Some problem "Expected Lua number" when use plugin. (Aldans, closed 7 months ago, 2 comments)
#48 Including the contents of the open buffers in the context when $buffers is in the prompt (kjjuno, opened 7 months ago, 7 comments)
#47 media files (David-Kunz, closed 7 months ago, 0 comments)
#46 Add Telescope Support and prepare for lasting Sessions (cloud-wanderer, closed 3 months ago, 4 comments)
#45 From mistral:instruct to mistral (weygoldt, closed 7 months ago, 4 comments)
#44 feat: allow to select a model (David-Kunz, closed 7 months ago, 0 comments)
#43 Suggestion: add choice of models to switch to (kapral18, closed 7 months ago, 3 comments)
#42 model change doesn't work (kapral18, closed 7 months ago, 2 comments)
#41 Lost functionality with output rendered with "termopen" (leafo, closed 7 months ago, 3 comments)
#40 Ctrl-c or Esc doesn't kill the job and so there is no way to stop the prompt without closing vim or waiting till its over. (JulesHauchecorne, closed 7 months ago, 3 comments)
#39 [feature request] prompt-answer history (floppydisken, closed 7 months ago, 3 comments)
#38 [feature request] dialogue with the model (floppydisken, closed 7 months ago, 1 comment)
#37 Feat: Allow to compare with current buffer (zippeurfou, closed 7 months ago, 1 comment)
#36 feature: Enable Context (kjjuno, closed 7 months ago, 19 comments)
#35 Gen.nvim don't work (yesidrs, closed 7 months ago, 2 comments)
#34 Telescope integration (dj95, closed 7 months ago, 2 comments)
#33 fix: Output window remains blank (macukadam, closed 7 months ago, 0 comments)
#32 Output window remains blank (siblanco, closed 7 months ago, 40 comments)
#31 Adds conversation support (kjjuno, closed 7 months ago, 10 comments)
#30 Allow for ollama REST API usage (RubenSmn, closed 7 months ago, 6 comments)
#29 hard to add this plugin using lazy.nvim with the container option setting. (meicale, closed 7 months ago, 5 comments)
#28 feature request - Add conversation Support (kjjuno, closed 7 months ago, 2 comments)
#27 fix: escape percentage symbols (David-Kunz, closed 8 months ago, 0 comments)
#26 feat(config): allow to set require('gen').win_config (David-Kunz, closed 8 months ago, 0 comments)
#25 Option to enable word wrap? (fredrikaverpil, closed 7 months ago, 5 comments)
#24 feat: support for Docker container; printing command stderr to output window (wishuuu, closed 8 months ago, 0 comments)
#23 problem with rust (joske, closed 8 months ago, 5 comments)
#22 feat: select model with ui (0xfraso, closed 7 months ago, 9 comments)
#21 fix: cancel job when floating window was closed prematurely (smjonas, closed 8 months ago, 1 comment)
#20 fix: handle cancellation of vim.ui.select by user (smjonas, closed 8 months ago, 1 comment)
#19 Fix code with help of LSP (pratikgajjar, closed 7 months ago, 3 comments)
#18 ask for diff mode (ayoubelmhamdi, closed 7 months ago, 3 comments)
#17 Docker (Flamme13, closed 3 months ago, 9 comments)
#16 Which-key seems to "steal" the buffer content (kozer, closed 7 months ago, 12 comments)
#15 Support defining a prompt as a function in addition to a string (leafo, closed 8 months ago, 1 comment)
#14 add complete function for :Gen command (leafo, closed 8 months ago, 1 comment)
#13 Add $register variable for prompts (leafo, closed 8 months ago, 1 comment)
#12 feat: Support stop serve (nfwyst, closed 8 months ago, 2 comments)
#11 fix(README): model name (nfwyst, closed 9 months ago, 0 comments)
#10 Sort prompt keys for the prompt select menu (leafo, closed 9 months ago, 1 comment)
#9 media (David-Kunz, closed 9 months ago, 0 comments)
#8 Possible cmp integration (JoseConseco, closed 7 months ago, 2 comments)
#7 Added option to specify custom llama model with prompt (JoseConseco, closed 9 months ago, 2 comments)
#6 Add parameters table to prompt (JoseConseco, closed 9 months ago, 6 comments)
#5 fix the shell escape of single quotes (alaaibrahim, closed 9 months ago, 0 comments)
#4 Prompt wont work in it has quote in it (JoseConseco, closed 9 months ago, 9 comments)
#3 Cannot override model (JoseConseco, closed 9 months ago, 1 comment)
#2 error in readme (JoseConseco, closed 9 months ago, 1 comment)